Apr 16 17:59:37.905217 ip-10-0-137-213 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 17:59:37.905230 ip-10-0-137-213 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 17:59:37.905239 ip-10-0-137-213 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 17:59:37.905543 ip-10-0-137-213 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 17:59:48.154289 ip-10-0-137-213 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 17:59:48.154308 ip-10-0-137-213 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f087ec870d414cd0ab41ec762e4fa7b0 --
Apr 16 18:01:58.560580 ip-10-0-137-213 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:01:59.038314 ip-10-0-137-213 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:01:59.038314 ip-10-0-137-213 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:01:59.038314 ip-10-0-137-213 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:01:59.038314 ip-10-0-137-213 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:01:59.038314 ip-10-0-137-213 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:01:59.041296 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.041186 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:01:59.045183 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045167 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:01:59.045183 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045183 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045187 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045190 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045193 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045196 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045203 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045206 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045210 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045212 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045215 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045218 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045220 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045237 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045242 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045245 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045250 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045254 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045258 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045261 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:01:59.045305 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045264 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045266 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045269 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045271 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045274 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045276 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045280 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045282 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045285 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045287 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045290 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045292 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045295 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045297 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045300 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045305 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045308 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045311 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045314 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045317 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:01:59.045834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045320 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045324 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045328 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045331 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045334 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045337 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045340 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045343 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045345 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045348 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045351 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045353 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045356 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045358 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045362 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045364 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045367 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045369 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045372 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:01:59.046334 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045375 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045377 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045380 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045382 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045384 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045387 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045389 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045391 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045395 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045398 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045402 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045406 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045409 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045412 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045414 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045417 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045421 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045423 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045426 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045428 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:01:59.046791 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045431 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045433 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045436 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045438 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045441 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045443 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045446 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045840 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045844 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045847 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045850 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045853 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045855 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045858 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045861 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045863 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045866 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045868 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045871 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045874 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045877 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:01:59.047291 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045881 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045884 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045887 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045889 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045892 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045894 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045897 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045899 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045902 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045904 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045907 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045909 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045912 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045914 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045917 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045919 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045922 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045924 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045927 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045930 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:01:59.047799 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045932 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045934 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045937 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045939 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045942 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045944 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045947 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045949 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045952 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045954 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045957 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045962 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045965 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045969 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045971 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045974 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045976 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045979 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045981 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045984 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:01:59.048341 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045986 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045989 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045991 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045993 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045996 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.045998 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046001 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046003 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046005 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046008 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046010 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046012 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046015 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046017 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046020 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046022 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046025 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046027 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046030 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:01:59.048824 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046032 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046035 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046038 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046042 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046046 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046049 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046052 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046055 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046057 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046060 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046062 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046064 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.046067 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047373 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047387 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047394 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047398 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047403 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047406 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047410 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:01:59.049310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047414 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047418 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047421 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047424 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047427 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047431 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047434 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047436 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047439 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047442 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047445 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047448 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047452 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047455 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047458 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047460 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047464 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047468 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047471 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047474 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047478 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047482 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047485 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047488 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047491 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:01:59.049801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047494 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047498 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047501 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047504 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047507 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047510 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047514 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047519 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047522 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047525 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047528 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047531 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047535 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047538 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047540 2576 flags.go:64] FLAG:
--eviction-pressure-transition-period="5m0s" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047544 2576 flags.go:64] FLAG: --eviction-soft="" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047546 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047549 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047552 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047555 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047557 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047560 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047563 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047567 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047570 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:01:59.050393 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047573 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047576 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047579 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047583 2576 flags.go:64] FLAG: --help="false" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: 
I0416 18:01:59.047586 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-213.ec2.internal" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047589 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047591 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047596 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047600 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047603 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047606 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047608 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047611 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047614 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047617 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047620 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047623 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047626 2576 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047629 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047632 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047635 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047638 2576 flags.go:64] FLAG: --lock-file="" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047640 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047644 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:01:59.050983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047647 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047655 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047658 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047661 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047664 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047666 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047670 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047673 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 18:01:59.051563 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:01:59.047676 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047680 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047683 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047687 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047690 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047693 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047696 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047700 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047703 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047706 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047709 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047716 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047719 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047722 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047725 
2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:01:59.051563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047728 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047734 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047737 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047740 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047743 2576 flags.go:64] FLAG: --port="10250" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047746 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047749 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01c39f4a6778eae5c" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047752 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047755 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047758 2576 flags.go:64] FLAG: --register-node="true" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047760 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047763 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047767 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047769 2576 flags.go:64] FLAG: --registry-qps="5" 
Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047772 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047775 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047779 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047782 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047785 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047788 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047791 2576 flags.go:64] FLAG: --runonce="false" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047794 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047798 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047801 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047805 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047808 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:01:59.052135 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047811 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047814 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:01:59.047817 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047820 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047823 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047826 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047829 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047832 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047834 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047837 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047843 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047845 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047848 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047852 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047855 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047857 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047860 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 
18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047863 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047866 2576 flags.go:64] FLAG: --v="2" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047870 2576 flags.go:64] FLAG: --version="false" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047874 2576 flags.go:64] FLAG: --vmodule="" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047878 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.047882 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.047975 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:01:59.052766 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.047979 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.047982 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.047985 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.047988 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.047990 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.047997 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:01:59.053380 ip-10-0-137-213 
kubenswrapper[2576]: W0416 18:01:59.048004 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048006 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048018 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048021 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048024 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048026 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048029 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048032 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048034 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048037 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048039 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048042 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048044 2576 feature_gate.go:328] unrecognized feature gate: 
AzureMultiDisk Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048047 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:01:59.053380 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048049 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048052 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048055 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048057 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048060 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048062 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048065 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048067 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048070 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048072 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048075 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048077 2576 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048079 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048082 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048084 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048087 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048089 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048091 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048096 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048099 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:01:59.053874 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048101 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048104 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048106 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048108 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 
16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048111 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048113 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048116 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048118 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048121 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048123 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048126 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048128 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048130 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048133 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048135 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048137 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048140 2576 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048142 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048145 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048147 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:01:59.054421 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048150 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048155 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048158 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048160 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048163 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048167 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048170 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048173 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048176 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048179 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048183 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048188 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048191 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048194 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048197 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048199 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048201 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048204 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048207 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:01:59.055205 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048209 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:01:59.055952 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048212 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:01:59.055952 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048214 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:01:59.055952 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048217 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:01:59.055952 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048219 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:01:59.055952 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.048234 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:01:59.055952 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.048944 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:01:59.057724 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.057705 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:01:59.057760 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.057726 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:01:59.057794 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057777 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:01:59.057794 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057783 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:01:59.057794 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057786 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:01:59.057794 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057789 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:01:59.057794 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057792 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:01:59.057794 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057794 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:01:59.057794 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057797 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057800 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057803 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057806 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057808 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057811 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057814 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057816 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057819 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057822 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057824 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057827 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057830 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057832 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057836 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057838 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057841 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057844 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057846 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:01:59.057970 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057849 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057851 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057854 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057857 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057859 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057862 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057864 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057867 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057869 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057872 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057874 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057877 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057879 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057881 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057884 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057887 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057890 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057893 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057895 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057899 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:01:59.058448 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057903 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057906 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057909 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057912 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057915 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057919 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057923 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057925 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057928 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057930 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057933 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057935 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057938 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057941 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057943 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057945 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057948 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057950 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057953 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:01:59.058957 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057956 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057958 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057961 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057963 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057966 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057968 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057971 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057973 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057977 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057980 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057982 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057985 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057988 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057991 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057993 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057996 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.057998 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058000 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058003 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058005 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:01:59.059432 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058008 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058010 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.058015 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058115 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058121 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058123 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058126 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058129 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058131 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058134 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058136 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058139 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058141 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058144 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058146 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058149 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:01:59.059944 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058152 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058154 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058157 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058159 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058162 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058165 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058167 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058171 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058175 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058178 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058181 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058184 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058186 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058189 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058191 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058194 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058196 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058199 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058202 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058204 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:01:59.060351 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058207 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058209 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058212 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058214 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058217 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058219 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058222 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058241 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058245 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058249 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058252 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058254 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058257 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058259 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058262 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058264 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058267 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058270 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058273 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:01:59.060830 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058275 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058279 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058283 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058287 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058290 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058293 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058295 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058298 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058301 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058303 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058306 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058309 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058311 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058314 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058317 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058319 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058322 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058324 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058327 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:01:59.061368 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058330 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058332 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058334 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058337 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058339 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058341 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058344 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058346 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058349 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058351 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058354 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058356 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058359 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058361 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:01:59.058364 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.058369 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:01:59.061838 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.059017 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:01:59.063087 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.063073 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:01:59.064043 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.064001 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 18:01:59.064131 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.064112 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:01:59.064170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.064149 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:01:59.088913 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.088890 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:01:59.091441 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.091418 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:01:59.110453 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.110422 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:01:59.115696 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.115679 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 18:01:59.117415 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.117398 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:01:59.122036 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.122017 2576 fs.go:135] Filesystem UUIDs: map[14787bf6-fd76-47a8-8aff-85695f8f0c66:/dev/nvme0n1p4 6b21a627-0c82-45fc-8b14-e95222e285a0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 18:01:59.122109 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.122053 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:01:59.128221 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.128106 2576 manager.go:217] Machine: {Timestamp:2026-04-16 18:01:59.126009711 +0000 UTC m=+0.438689194 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3103497 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2cbc600a39891084683a69ddf592fc SystemUUID:ec2cbc60-0a39-8910-8468-3a69ddf592fc BootID:f087ec87-0d41-4cd0-ab41-ec762e4fa7b0 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c9:8a:cf:95:01 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c9:8a:cf:95:01 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:98:eb:be:82:51 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:01:59.128221 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.128214 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:01:59.128340 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.128314 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:01:59.129505 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.129482 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:01:59.129647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.129506 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-213.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:01:59.129694 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.129659 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:01:59.129694 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.129667 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:01:59.129694 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.129681 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:01:59.129789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.129697 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:01:59.129789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.129769 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 18:01:59.131069 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.131057 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:01:59.131181 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.131172 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:01:59.133405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.133395 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:01:59.133447 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.133413 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:01:59.133447 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.133424 2576 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 16 18:01:59.133447 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.133434 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:01:59.133447 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.133443 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 18:01:59.134547 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.134535 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:01:59.134595 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.134556 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:01:59.137622 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.137606 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:01:59.139513 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.139499 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:01:59.140936 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140911 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:01:59.140936 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140930 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:01:59.140936 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140936 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140942 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140953 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 
18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140962 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140971 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140980 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140990 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.140999 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.141019 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:01:59.141102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.141029 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:01:59.142547 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.142534 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:01:59.142547 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.142546 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:01:59.146329 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.146316 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:01:59.146379 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.146354 2576 server.go:1295] "Started kubelet" Apr 16 18:01:59.146512 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.146463 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:01:59.146562 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.146507 2576 ratelimit.go:55] "Setting rate limiting for endpoint" 
service="podresources" qps=100 burstTokens=10 Apr 16 18:01:59.146599 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.146569 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:01:59.146789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.146771 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-213.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:01:59.146890 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.146865 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-213.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:01:59.147039 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.147021 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:01:59.147419 ip-10-0-137-213 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:01:59.148052 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.147995 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:01:59.148446 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.148433 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:01:59.156773 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.156749 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:01:59.157360 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.157343 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:01:59.158080 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158065 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:01:59.158080 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158082 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:01:59.158200 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158068 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:01:59.158458 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.158434 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found" Apr 16 18:01:59.158458 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158439 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:01:59.158619 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158466 2576 factory.go:55] Registering systemd factory Apr 16 18:01:59.158619 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158481 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 
18:01:59.158619 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158573 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:01:59.158619 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158581 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:01:59.158800 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158698 2576 factory.go:153] Registering CRI-O factory Apr 16 18:01:59.158800 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158711 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 18:01:59.158800 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158736 2576 factory.go:103] Registering Raw factory Apr 16 18:01:59.158800 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.158751 2576 manager.go:1196] Started watching for new ooms in manager Apr 16 18:01:59.159165 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.159152 2576 manager.go:319] Starting recovery of all containers Apr 16 18:01:59.159375 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.158380 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-213.ec2.internal.18a6e84cb208f49f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-213.ec2.internal,UID:ip-10-0-137-213.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-213.ec2.internal,},FirstTimestamp:2026-04-16 18:01:59.146329247 +0000 UTC m=+0.459008734,LastTimestamp:2026-04-16 18:01:59.146329247 +0000 UTC m=+0.459008734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-213.ec2.internal,}" Apr 16 18:01:59.160507 
ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.160366 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:01:59.161673 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.161621 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-213.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 18:01:59.161878 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.161850 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 18:01:59.170111 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.169552 2576 manager.go:324] Recovery completed Apr 16 18:01:59.171417 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.171379 2576 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 18:01:59.174495 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.174481 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:01:59.176929 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.176904 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:01:59.176987 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.176950 2576 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:01:59.176987 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.176963 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:01:59.177460 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.177446 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:01:59.177504 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.177461 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:01:59.177504 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.177479 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:01:59.179531 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.179461 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-213.ec2.internal.18a6e84cb3dbdd4d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-213.ec2.internal,UID:ip-10-0-137-213.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-213.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-213.ec2.internal,},FirstTimestamp:2026-04-16 18:01:59.176928589 +0000 UTC m=+0.489608073,LastTimestamp:2026-04-16 18:01:59.176928589 +0000 UTC m=+0.489608073,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-213.ec2.internal,}" Apr 16 18:01:59.180153 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.180133 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jctrl" Apr 16 18:01:59.180310 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.180298 2576 policy_none.go:49] "None policy: Start" Apr 16 18:01:59.180349 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.180315 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:01:59.180349 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.180325 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:01:59.186904 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.186887 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jctrl" Apr 16 18:01:59.188747 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.188688 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-213.ec2.internal.18a6e84cb3dc4771 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-213.ec2.internal,UID:ip-10-0-137-213.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-137-213.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-137-213.ec2.internal,},FirstTimestamp:2026-04-16 18:01:59.176955761 +0000 UTC m=+0.489635244,LastTimestamp:2026-04-16 18:01:59.176955761 +0000 UTC m=+0.489635244,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-213.ec2.internal,}" Apr 16 18:01:59.232617 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.232577 2576 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.233952 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.233987 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.234010 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.234021 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.234061 2576 kubelet.go:2475] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.234405 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.234443 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.234455 2576 server.go:85] "Starting device plugin registration server" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.234745 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.234756 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.234861 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:01:59.237070 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:01:59.234952 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.234961 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.235482 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:01:59.237070 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.235522 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-213.ec2.internal\" not found" Apr 16 18:01:59.238132 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.238112 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:01:59.335109 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.335019 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal"] Apr 16 18:01:59.335270 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.335119 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:01:59.335270 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.335028 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:01:59.336355 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.336334 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:01:59.336468 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.336365 2576 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-137-213.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:01:59.336468 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.336379 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:01:59.336468 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.336406 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-213.ec2.internal" Apr 16 18:01:59.336468 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.336336 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:01:59.336468 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.336455 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:01:59.336682 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.336471 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:01:59.337829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.337812 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:01:59.337949 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.337933 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal" Apr 16 18:01:59.337998 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.337968 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:01:59.338521 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.338505 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:01:59.338617 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.338533 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:01:59.338617 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.338547 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:01:59.338617 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.338512 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:01:59.338617 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.338599 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:01:59.338617 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.338608 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:01:59.340185 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.340168 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal" Apr 16 18:01:59.340294 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.340201 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:01:59.340827 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.340811 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:01:59.340904 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.340837 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:01:59.340904 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.340847 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:01:59.344557 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.344538 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-213.ec2.internal" Apr 16 18:01:59.344645 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.344561 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-213.ec2.internal\": node \"ip-10-0-137-213.ec2.internal\" not found" Apr 16 18:01:59.357879 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.357862 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found" Apr 16 18:01:59.372911 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.372886 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-213.ec2.internal\" not found" node="ip-10-0-137-213.ec2.internal" Apr 16 18:01:59.377392 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.377377 2576 kubelet.go:3336] "No need to create a mirror pod, 
since failed to get node info from the cluster" err="node \"ip-10-0-137-213.ec2.internal\" not found" node="ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.458983 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.458940 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found"
Apr 16 18:01:59.459063 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.458993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51f413f04a3f9e056bc3cfdb194d79d1-config\") pod \"kube-apiserver-proxy-ip-10-0-137-213.ec2.internal\" (UID: \"51f413f04a3f9e056bc3cfdb194d79d1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.459063 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.459015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f4cc9775d55b45f7f0c31dcee502aa5c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal\" (UID: \"f4cc9775d55b45f7f0c31dcee502aa5c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.459063 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.459035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4cc9775d55b45f7f0c31dcee502aa5c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal\" (UID: \"f4cc9775d55b45f7f0c31dcee502aa5c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.559337 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.559308 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found"
Apr 16 18:01:59.559436 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.559353 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f4cc9775d55b45f7f0c31dcee502aa5c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal\" (UID: \"f4cc9775d55b45f7f0c31dcee502aa5c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.559436 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.559379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4cc9775d55b45f7f0c31dcee502aa5c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal\" (UID: \"f4cc9775d55b45f7f0c31dcee502aa5c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.559436 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.559400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51f413f04a3f9e056bc3cfdb194d79d1-config\") pod \"kube-apiserver-proxy-ip-10-0-137-213.ec2.internal\" (UID: \"51f413f04a3f9e056bc3cfdb194d79d1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.559541 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.559461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4cc9775d55b45f7f0c31dcee502aa5c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal\" (UID: \"f4cc9775d55b45f7f0c31dcee502aa5c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.559541 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.559482 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f4cc9775d55b45f7f0c31dcee502aa5c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal\" (UID: \"f4cc9775d55b45f7f0c31dcee502aa5c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.559541 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.559460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/51f413f04a3f9e056bc3cfdb194d79d1-config\") pod \"kube-apiserver-proxy-ip-10-0-137-213.ec2.internal\" (UID: \"51f413f04a3f9e056bc3cfdb194d79d1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.660059 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.659993 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found"
Apr 16 18:01:59.676205 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.676172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.680629 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:01:59.680467 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal"
Apr 16 18:01:59.760353 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.760316 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found"
Apr 16 18:01:59.860726 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.860696 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found"
Apr 16 18:01:59.961199 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:01:59.961136 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found"
Apr 16 18:02:00.053533 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.053502 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:00.061257 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:00.061235 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found"
Apr 16 18:02:00.064379 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.064363 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:02:00.064520 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.064503 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:02:00.064570 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.064537 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:02:00.157082 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.157056 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:02:00.161397 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:00.161373 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-213.ec2.internal\" not found"
Apr 16 18:02:00.169483 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.169461 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:02:00.188690 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.188652 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:56:59 +0000 UTC" deadline="2028-01-18 03:04:05.874184068 +0000 UTC"
Apr 16 18:02:00.188690 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.188691 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15393h2m5.685499566s"
Apr 16 18:02:00.217519 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.217450 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:00.257205 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.257177 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fwncm"
Apr 16 18:02:00.258237 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.258213 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal"
Apr 16 18:02:00.281417 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.281394 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fwncm"
Apr 16 18:02:00.285150 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:00.285122 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4cc9775d55b45f7f0c31dcee502aa5c.slice/crio-1aafbbf6ae4fcf85b0dbe8a78fae08ffdbf233c0cf69cda6989408a5cc3f4915 WatchSource:0}: Error finding container 1aafbbf6ae4fcf85b0dbe8a78fae08ffdbf233c0cf69cda6989408a5cc3f4915: Status 404 returned error can't find the container with id 1aafbbf6ae4fcf85b0dbe8a78fae08ffdbf233c0cf69cda6989408a5cc3f4915
Apr 16 18:02:00.285648 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:00.285631 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f413f04a3f9e056bc3cfdb194d79d1.slice/crio-e6b02f1175629da8dd38a03af9ef653cd9abb33f50d68b198f49a5858ae6ae00 WatchSource:0}: Error finding container e6b02f1175629da8dd38a03af9ef653cd9abb33f50d68b198f49a5858ae6ae00: Status 404 returned error can't find the container with id e6b02f1175629da8dd38a03af9ef653cd9abb33f50d68b198f49a5858ae6ae00
Apr 16 18:02:00.290555 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.290540 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:02:00.292869 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.292853 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:02:00.294666 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.294649 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal"
Apr 16 18:02:00.315697 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.315664 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:02:00.693868 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.693786 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:00.997350 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:00.997271 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:02:01.134477 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.134442 2576 apiserver.go:52] "Watching apiserver"
Apr 16 18:02:01.152807 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.152770 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:02:01.154285 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.154253 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-ghw9q","openshift-network-operator/iptables-alerter-jql59","kube-system/konnectivity-agent-nj79n","kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal","openshift-image-registry/node-ca-vzsgf","openshift-multus/multus-additional-cni-plugins-gnj8w","openshift-multus/network-metrics-daemon-892g8","openshift-ovn-kubernetes/ovnkube-node-k7zg2","kube-system/global-pull-secret-syncer-brlng","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w","openshift-cluster-node-tuning-operator/tuned-74z96","openshift-dns/node-resolver-q59mj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal","openshift-multus/multus-dh898"]
Apr 16 18:02:01.157646 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.157625 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.160426 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.160400 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.160643 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.160624 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:02:01.160867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.160852 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:02:01.160919 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.160875 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:02:01.161039 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.161022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qktfc\""
Apr 16 18:02:01.162030 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.161445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nj79n"
Apr 16 18:02:01.162030 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.161543 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dh898"
Apr 16 18:02:01.163588 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.163570 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:02:01.163677 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.163609 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zxmdm\""
Apr 16 18:02:01.165008 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.164525 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vzsgf"
Apr 16 18:02:01.165008 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.164665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gnj8w"
Apr 16 18:02:01.165483 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.165464 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:02:01.165975 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.165956 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:02:01.166320 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.166300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9f7s6\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167057 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167134 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167143 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167272 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d8fxd\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167061 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167345 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167272 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167433 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167613 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167651 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6w628\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167729 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-modprobe-d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-kubernetes\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysctl-conf\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-sys\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-host\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167965 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-socket-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.167992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-device-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-etc-selinux\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-systemd\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-tmp\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.168357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168130 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysctl-d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-registration-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-sys-fs\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpthp\" (UniqueName: \"kubernetes.io/projected/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-kube-api-access-jpthp\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysconfig\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-run\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-lib-modules\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-var-lib-kubelet\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168426 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-tuned\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk99d\" (UniqueName: \"kubernetes.io/projected/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-kube-api-access-wk99d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168684 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2"
Apr 16 18:02:01.169170 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.168700 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q59mj"
Apr 16 18:02:01.169680 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.169312 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:02:01.169680 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.169547 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pplkx\""
Apr 16 18:02:01.171033 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.170566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jql59"
Apr 16 18:02:01.171426 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.171407 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:02:01.171563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.171518 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:02:01.171741 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.171725 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:02:01.171997 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.171978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:02:01.172110 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.172091 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:02:01.172529 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.172512 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:01.172626 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.172582 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:01.172998 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.172764 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rtd5c\""
Apr 16 18:02:01.172998 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.172811 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:02:01.172998 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.172913 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-c8lcb\""
Apr 16 18:02:01.174249 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.174113 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:02:01.174886 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.174866 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pcbgg\""
Apr 16 18:02:01.175188 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.174732 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:02:01.175290 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.175266 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:02:01.175464 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.175403 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:02:01.176111 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.175972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:02:01.176542 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.176431 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:01.176542 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.176465 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:01.176542 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.176502 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:01.176542 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.176512 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:01.238720 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.238655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal" event={"ID":"f4cc9775d55b45f7f0c31dcee502aa5c","Type":"ContainerStarted","Data":"1aafbbf6ae4fcf85b0dbe8a78fae08ffdbf233c0cf69cda6989408a5cc3f4915"}
Apr 16 18:02:01.239852 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.239810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal" event={"ID":"51f413f04a3f9e056bc3cfdb194d79d1","Type":"ContainerStarted","Data":"e6b02f1175629da8dd38a03af9ef653cd9abb33f50d68b198f49a5858ae6ae00"}
Apr 16 18:02:01.259801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.259741 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:02:01.269535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-cnibin\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898"
Apr 16 18:02:01.269686 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:01.269758 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-sys\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.269758 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-sys\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.269861 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/734a1c0e-a532-48d0-9ded-1550c1e4391c-dbus\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:01.269861 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1934f30b-41be-47ce-b2a7-9accbed71976-iptables-alerter-script\") pod \"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59"
Apr 16 18:02:01.269861 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-etc-selinux\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.270003 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-cni-binary-copy\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898"
Apr 16 18:02:01.270003 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cni-binary-copy\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w"
Apr 16 18:02:01.270003 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-systemd\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2"
Apr 16 18:02:01.270003 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.269995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-env-overrides\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2"
Apr 16 18:02:01.270178 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovn-node-metrics-cert\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2"
Apr 16 18:02:01.270178 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-etc-selinux\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w"
Apr 16 18:02:01.270178 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysctl-d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.270178 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-run\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96"
Apr 16 18:02:01.270178 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-cni-multus\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898"
Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-kubelet\") pod \"multus-dh898\" (UID:
\"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysctl-d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-run\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-conf-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-ovn\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-cni-bin\") pod \"ovnkube-node-k7zg2\" (UID: 
\"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-var-lib-kubelet\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk99d\" (UniqueName: \"kubernetes.io/projected/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-kube-api-access-wk99d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bd40f763-9175-4f8d-850e-89f05f5ff1b8-konnectivity-ca\") pod \"konnectivity-agent-nj79n\" (UID: \"bd40f763-9175-4f8d-850e-89f05f5ff1b8\") " pod="kube-system/konnectivity-agent-nj79n" Apr 16 18:02:01.270405 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-etc-kubernetes\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-var-lib-kubelet\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwxxh\" (UniqueName: \"kubernetes.io/projected/f342f33f-7ce1-4c45-a212-83b4c6fe1952-kube-api-access-jwxxh\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysctl-conf\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.270796 
ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-os-release\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-hostroot\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md96w\" (UniqueName: \"kubernetes.io/projected/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-kube-api-access-md96w\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cnibin\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-device-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysctl-conf\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.270796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-tmp\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-device-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njctl\" (UniqueName: \"kubernetes.io/projected/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-kube-api-access-njctl\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/bd40f763-9175-4f8d-850e-89f05f5ff1b8-agent-certs\") pod \"konnectivity-agent-nj79n\" (UID: \"bd40f763-9175-4f8d-850e-89f05f5ff1b8\") " pod="kube-system/konnectivity-agent-nj79n" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-registration-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpthp\" (UniqueName: \"kubernetes.io/projected/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-kube-api-access-jpthp\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-daemon-config\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-system-cni-dir\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.271319 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:02:01.270975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkf8c\" (UniqueName: \"kubernetes.io/projected/1934f30b-41be-47ce-b2a7-9accbed71976-kube-api-access-dkf8c\") pod \"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-registration-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.270994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-slash\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1934f30b-41be-47ce-b2a7-9accbed71976-host-slash\") pod 
\"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-etc-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovnkube-config\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovnkube-script-lib\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.271319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-modprobe-d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-kubernetes\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-host\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85j2n\" (UniqueName: \"kubernetes.io/projected/446d8c35-b0da-42e5-a071-ea17b9747bb2-kube-api-access-85j2n\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-systemd-units\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-modprobe-d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271366 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-node-log\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-host\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-kubernetes\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-cni-netd\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-socket-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271556 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-systemd\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-socket-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-cni-bin\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-systemd\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-multus-certs\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.272086 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:02:01.271652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:01.272086 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-kubelet\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271736 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-k8s-cni-cncf-io\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271766 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-system-cni-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-os-release\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/798435e6-adbf-486f-bd1a-ba36ade6c8d3-hosts-file\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271971 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-socket-dir-parent\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.271997 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/446d8c35-b0da-42e5-a071-ea17b9747bb2-host\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/446d8c35-b0da-42e5-a071-ea17b9747bb2-serviceca\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272062 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-var-lib-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/798435e6-adbf-486f-bd1a-ba36ade6c8d3-tmp-dir\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/734a1c0e-a532-48d0-9ded-1550c1e4391c-kubelet-config\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:01.272798 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-netns\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" 
Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272243 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-run-netns\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-log-socket\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpg7b\" (UniqueName: \"kubernetes.io/projected/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-kube-api-access-dpg7b\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flz9b\" (UniqueName: \"kubernetes.io/projected/798435e6-adbf-486f-bd1a-ba36ade6c8d3-kube-api-access-flz9b\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-sys-fs\") pod \"aws-ebs-csi-driver-node-nv76w\" 
(UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysconfig\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-sysconfig\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-cni-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-lib-modules\") 
pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-sys-fs\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-tuned\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.273455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.272596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-lib-modules\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.275062 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.275042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-tmp\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.275139 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.275049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-etc-tuned\") pod \"tuned-74z96\" (UID: 
\"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.282088 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.282056 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:00 +0000 UTC" deadline="2028-01-22 16:21:55.873810586 +0000 UTC" Apr 16 18:02:01.282194 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.282089 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15502h19m54.591725741s" Apr 16 18:02:01.284479 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.284423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk99d\" (UniqueName: \"kubernetes.io/projected/dec1db68-1de1-42a3-a8a4-3f8fa4f45b65-kube-api-access-wk99d\") pod \"tuned-74z96\" (UID: \"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65\") " pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.286691 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.286660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpthp\" (UniqueName: \"kubernetes.io/projected/f8fb6f5b-4a16-4820-b62a-60ccc1a0388f-kube-api-access-jpthp\") pod \"aws-ebs-csi-driver-node-nv76w\" (UID: \"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.373394 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bd40f763-9175-4f8d-850e-89f05f5ff1b8-agent-certs\") pod \"konnectivity-agent-nj79n\" (UID: \"bd40f763-9175-4f8d-850e-89f05f5ff1b8\") " pod="kube-system/konnectivity-agent-nj79n" Apr 16 18:02:01.373394 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373397 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-daemon-config\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-system-cni-dir\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkf8c\" (UniqueName: \"kubernetes.io/projected/1934f30b-41be-47ce-b2a7-9accbed71976-kube-api-access-dkf8c\") pod \"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-slash\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:02:01.373505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1934f30b-41be-47ce-b2a7-9accbed71976-host-slash\") pod \"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-etc-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-system-cni-dir\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovnkube-config\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovnkube-script-lib\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 
18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85j2n\" (UniqueName: \"kubernetes.io/projected/446d8c35-b0da-42e5-a071-ea17b9747bb2-kube-api-access-85j2n\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-systemd-units\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-node-log\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-cni-netd\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-cni-bin\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 
18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373709 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-slash\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-node-log\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-cni-bin\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-systemd-units\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1934f30b-41be-47ce-b2a7-9accbed71976-host-slash\") pod \"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: 
I0416 18:02:01.373851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-etc-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-multus-certs\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-multus-certs\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-kubelet\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:02:01.373965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.373994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.373998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-k8s-cni-cncf-io\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-kubelet\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-system-cni-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" 
Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-daemon-config\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-os-release\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-k8s-cni-cncf-io\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/798435e6-adbf-486f-bd1a-ba36ade6c8d3-hosts-file\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-system-cni-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-socket-dir-parent\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/446d8c35-b0da-42e5-a071-ea17b9747bb2-host\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-socket-dir-parent\") pod \"multus-dh898\" 
(UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/446d8c35-b0da-42e5-a071-ea17b9747bb2-serviceca\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-var-lib-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/798435e6-adbf-486f-bd1a-ba36ade6c8d3-tmp-dir\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/798435e6-adbf-486f-bd1a-ba36ade6c8d3-hosts-file\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.374749 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/734a1c0e-a532-48d0-9ded-1550c1e4391c-kubelet-config\") pod \"global-pull-secret-syncer-brlng\" (UID: 
\"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-cni-netd\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-netns\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374480 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-run-netns\") pod 
\"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-log-socket\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.374522 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpg7b\" (UniqueName: \"kubernetes.io/projected/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-kube-api-access-dpg7b\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flz9b\" (UniqueName: \"kubernetes.io/projected/798435e6-adbf-486f-bd1a-ba36ade6c8d3-kube-api-access-flz9b\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.374596 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret podName:734a1c0e-a532-48d0-9ded-1550c1e4391c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:01.874566167 +0000 UTC m=+3.187245639 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret") pod "global-pull-secret-syncer-brlng" (UID: "734a1c0e-a532-48d0-9ded-1550c1e4391c") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-cni-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-cnibin\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/734a1c0e-a532-48d0-9ded-1550c1e4391c-dbus\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374758 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1934f30b-41be-47ce-b2a7-9accbed71976-iptables-alerter-script\") pod \"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/798435e6-adbf-486f-bd1a-ba36ade6c8d3-tmp-dir\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.375535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-run-netns\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-var-lib-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374839 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/446d8c35-b0da-42e5-a071-ea17b9747bb2-serviceca\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374850 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-cni-binary-copy\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-run-netns\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/446d8c35-b0da-42e5-a071-ea17b9747bb2-host\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cni-binary-copy\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/734a1c0e-a532-48d0-9ded-1550c1e4391c-kubelet-config\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovnkube-script-lib\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-systemd\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-env-overrides\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovn-node-metrics-cert\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.374983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-cni-multus\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375010 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-kubelet\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-conf-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-ovn\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-log-socket\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-cni-bin\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bd40f763-9175-4f8d-850e-89f05f5ff1b8-konnectivity-ca\") pod \"konnectivity-agent-nj79n\" (UID: \"bd40f763-9175-4f8d-850e-89f05f5ff1b8\") " pod="kube-system/konnectivity-agent-nj79n" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-etc-kubernetes\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/1934f30b-41be-47ce-b2a7-9accbed71976-iptables-alerter-script\") pod \"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375273 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwxxh\" (UniqueName: \"kubernetes.io/projected/f342f33f-7ce1-4c45-a212-83b4c6fe1952-kube-api-access-jwxxh\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-os-release\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-openvswitch\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-hostroot\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-cni-binary-copy\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375369 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-md96w\" (UniqueName: \"kubernetes.io/projected/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-kube-api-access-md96w\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cnibin\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njctl\" (UniqueName: \"kubernetes.io/projected/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-kube-api-access-njctl\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-os-release\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-cni-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:02:01.375652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cni-binary-copy\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.375666 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-cni-multus\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.376961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-cnibin\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.375712 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:01.875695849 +0000 UTC m=+3.188375320 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cnibin\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-host-var-lib-kubelet\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-etc-kubernetes\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-ovn\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-multus-conf-dir\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-os-release\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375858 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/734a1c0e-a532-48d0-9ded-1550c1e4391c-dbus\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-cni-bin\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.375947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-hostroot\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " pod="openshift-multus/multus-dh898" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.376002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-run-systemd\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.376073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bd40f763-9175-4f8d-850e-89f05f5ff1b8-agent-certs\") pod \"konnectivity-agent-nj79n\" (UID: \"bd40f763-9175-4f8d-850e-89f05f5ff1b8\") " pod="kube-system/konnectivity-agent-nj79n" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.376119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovnkube-config\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.376245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.376289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-env-overrides\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.377647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.376374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.378210 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.376391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bd40f763-9175-4f8d-850e-89f05f5ff1b8-konnectivity-ca\") pod \"konnectivity-agent-nj79n\" (UID: \"bd40f763-9175-4f8d-850e-89f05f5ff1b8\") " pod="kube-system/konnectivity-agent-nj79n" Apr 16 18:02:01.378350 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.378332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-ovn-node-metrics-cert\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.382443 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.382420 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:01.382443 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.382445 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:01.382786 ip-10-0-137-213 
kubenswrapper[2576]: E0416 18:02:01.382459 2576 projected.go:194] Error preparing data for projected volume kube-api-access-p4ps4 for pod openshift-network-diagnostics/network-check-target-ghw9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:01.382786 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.382512 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4 podName:310f5d23-e68e-46b7-808d-ca6cb602e572 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:01.882496488 +0000 UTC m=+3.195175965 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-p4ps4" (UniqueName: "kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4") pod "network-check-target-ghw9q" (UID: "310f5d23-e68e-46b7-808d-ca6cb602e572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:01.384520 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.384496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flz9b\" (UniqueName: \"kubernetes.io/projected/798435e6-adbf-486f-bd1a-ba36ade6c8d3-kube-api-access-flz9b\") pod \"node-resolver-q59mj\" (UID: \"798435e6-adbf-486f-bd1a-ba36ade6c8d3\") " pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.384841 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.384811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85j2n\" (UniqueName: \"kubernetes.io/projected/446d8c35-b0da-42e5-a071-ea17b9747bb2-kube-api-access-85j2n\") pod \"node-ca-vzsgf\" (UID: \"446d8c35-b0da-42e5-a071-ea17b9747bb2\") " pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.385035 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:02:01.384965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkf8c\" (UniqueName: \"kubernetes.io/projected/1934f30b-41be-47ce-b2a7-9accbed71976-kube-api-access-dkf8c\") pod \"iptables-alerter-jql59\" (UID: \"1934f30b-41be-47ce-b2a7-9accbed71976\") " pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.385505 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.385489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwxxh\" (UniqueName: \"kubernetes.io/projected/f342f33f-7ce1-4c45-a212-83b4c6fe1952-kube-api-access-jwxxh\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:01.385996 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.385976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpg7b\" (UniqueName: \"kubernetes.io/projected/d16ce647-f47f-4f7b-9607-e47c6d4e67ce-kube-api-access-dpg7b\") pod \"ovnkube-node-k7zg2\" (UID: \"d16ce647-f47f-4f7b-9607-e47c6d4e67ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.386968 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.386946 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njctl\" (UniqueName: \"kubernetes.io/projected/b9e0fd91-9b40-48e9-87ac-be0b97367fc5-kube-api-access-njctl\") pod \"multus-additional-cni-plugins-gnj8w\" (UID: \"b9e0fd91-9b40-48e9-87ac-be0b97367fc5\") " pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.386968 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.386966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md96w\" (UniqueName: \"kubernetes.io/projected/a787fe1a-b0bd-4485-b5c9-a196d280f7c1-kube-api-access-md96w\") pod \"multus-dh898\" (UID: \"a787fe1a-b0bd-4485-b5c9-a196d280f7c1\") " 
pod="openshift-multus/multus-dh898" Apr 16 18:02:01.469536 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.469503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" Apr 16 18:02:01.478416 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.478392 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-74z96" Apr 16 18:02:01.490968 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.490946 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nj79n" Apr 16 18:02:01.495618 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.495597 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dh898" Apr 16 18:02:01.502183 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.502164 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vzsgf" Apr 16 18:02:01.508776 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.508759 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" Apr 16 18:02:01.515457 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.515395 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:01.522046 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.522026 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-q59mj" Apr 16 18:02:01.528633 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.528614 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-jql59" Apr 16 18:02:01.879299 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.879192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:01.879299 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.879273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:01.879521 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.879382 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:01.879521 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.879425 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:01.879521 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.879459 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret podName:734a1c0e-a532-48d0-9ded-1550c1e4391c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:02.879437982 +0000 UTC m=+4.192117454 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret") pod "global-pull-secret-syncer-brlng" (UID: "734a1c0e-a532-48d0-9ded-1550c1e4391c") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:01.879521 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.879484 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:02.879467863 +0000 UTC m=+4.192147336 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:01.979871 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:01.979842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:01.980031 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.980004 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:01.980086 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.980037 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:01.980086 ip-10-0-137-213 
kubenswrapper[2576]: E0416 18:02:01.980047 2576 projected.go:194] Error preparing data for projected volume kube-api-access-p4ps4 for pod openshift-network-diagnostics/network-check-target-ghw9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:01.980166 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:01.980110 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4 podName:310f5d23-e68e-46b7-808d-ca6cb602e572 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:02.980084278 +0000 UTC m=+4.292763749 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-p4ps4" (UniqueName: "kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4") pod "network-check-target-ghw9q" (UID: "310f5d23-e68e-46b7-808d-ca6cb602e572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:01.991065 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:01.991040 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e0fd91_9b40_48e9_87ac_be0b97367fc5.slice/crio-ca02860e5ac7e233ec8a28b5ff970ac4f5836e1f7b45cdc13885b4b9f2b888d3 WatchSource:0}: Error finding container ca02860e5ac7e233ec8a28b5ff970ac4f5836e1f7b45cdc13885b4b9f2b888d3: Status 404 returned error can't find the container with id ca02860e5ac7e233ec8a28b5ff970ac4f5836e1f7b45cdc13885b4b9f2b888d3 Apr 16 18:02:02.000849 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:02.000824 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec1db68_1de1_42a3_a8a4_3f8fa4f45b65.slice/crio-b946c1a59b19af5fa148e0f6f1d3c4fc1ef254c4258f314c93b60d0407187f19 WatchSource:0}: Error finding container b946c1a59b19af5fa148e0f6f1d3c4fc1ef254c4258f314c93b60d0407187f19: Status 404 returned error can't find the container with id b946c1a59b19af5fa148e0f6f1d3c4fc1ef254c4258f314c93b60d0407187f19 Apr 16 18:02:02.001663 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:02.001629 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd40f763_9175_4f8d_850e_89f05f5ff1b8.slice/crio-c6165b2e1bdca762e25fb08dcf74452e1ff1d9d5ba00dcfb81a28a6f3b2375be WatchSource:0}: Error finding container c6165b2e1bdca762e25fb08dcf74452e1ff1d9d5ba00dcfb81a28a6f3b2375be: Status 404 returned error can't find the container with id c6165b2e1bdca762e25fb08dcf74452e1ff1d9d5ba00dcfb81a28a6f3b2375be Apr 16 18:02:02.002626 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:02.002478 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16ce647_f47f_4f7b_9607_e47c6d4e67ce.slice/crio-335fa92f5ede250db825cbc9a5321d7b1ed6199847719dd03a08f0a55194542e WatchSource:0}: Error finding container 335fa92f5ede250db825cbc9a5321d7b1ed6199847719dd03a08f0a55194542e: Status 404 returned error can't find the container with id 335fa92f5ede250db825cbc9a5321d7b1ed6199847719dd03a08f0a55194542e Apr 16 18:02:02.003480 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:02.003460 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod446d8c35_b0da_42e5_a071_ea17b9747bb2.slice/crio-afb3f70f3b0f79019475eb96e5f86980447c179f6ab74fedc7ec07e709b76098 WatchSource:0}: Error finding container afb3f70f3b0f79019475eb96e5f86980447c179f6ab74fedc7ec07e709b76098: Status 404 returned error can't find 
the container with id afb3f70f3b0f79019475eb96e5f86980447c179f6ab74fedc7ec07e709b76098 Apr 16 18:02:02.004504 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:02.004467 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fb6f5b_4a16_4820_b62a_60ccc1a0388f.slice/crio-e5f0f2c3dd7b4e1607aa9524dffb1e5e56ed4e1b102a0bd234e30b8c92e17ea3 WatchSource:0}: Error finding container e5f0f2c3dd7b4e1607aa9524dffb1e5e56ed4e1b102a0bd234e30b8c92e17ea3: Status 404 returned error can't find the container with id e5f0f2c3dd7b4e1607aa9524dffb1e5e56ed4e1b102a0bd234e30b8c92e17ea3 Apr 16 18:02:02.005364 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:02.005210 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda787fe1a_b0bd_4485_b5c9_a196d280f7c1.slice/crio-16f021192f4db9dda31bda34cb1f6aba8fdd3f98cadf6e8ed7ea807878e69195 WatchSource:0}: Error finding container 16f021192f4db9dda31bda34cb1f6aba8fdd3f98cadf6e8ed7ea807878e69195: Status 404 returned error can't find the container with id 16f021192f4db9dda31bda34cb1f6aba8fdd3f98cadf6e8ed7ea807878e69195 Apr 16 18:02:02.006583 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:02:02.006561 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1934f30b_41be_47ce_b2a7_9accbed71976.slice/crio-736c8e8f328df67940eee8a1c22490a5cbc3bc5c6a4ea083a92184ad5e3a7eb5 WatchSource:0}: Error finding container 736c8e8f328df67940eee8a1c22490a5cbc3bc5c6a4ea083a92184ad5e3a7eb5: Status 404 returned error can't find the container with id 736c8e8f328df67940eee8a1c22490a5cbc3bc5c6a4ea083a92184ad5e3a7eb5 Apr 16 18:02:02.235099 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.234836 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:02.235099 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.234898 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:02.235628 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.235126 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:02.235628 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.235169 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:02.242068 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.242031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"335fa92f5ede250db825cbc9a5321d7b1ed6199847719dd03a08f0a55194542e"} Apr 16 18:02:02.243495 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.243472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jql59" event={"ID":"1934f30b-41be-47ce-b2a7-9accbed71976","Type":"ContainerStarted","Data":"736c8e8f328df67940eee8a1c22490a5cbc3bc5c6a4ea083a92184ad5e3a7eb5"} Apr 16 18:02:02.244502 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.244479 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nj79n" event={"ID":"bd40f763-9175-4f8d-850e-89f05f5ff1b8","Type":"ContainerStarted","Data":"c6165b2e1bdca762e25fb08dcf74452e1ff1d9d5ba00dcfb81a28a6f3b2375be"} Apr 16 18:02:02.245511 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.245486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q59mj" event={"ID":"798435e6-adbf-486f-bd1a-ba36ade6c8d3","Type":"ContainerStarted","Data":"3fcbba5592c878494617becd4a7cfcb8b2c3476a58f00597dbfde65e97285dca"} Apr 16 18:02:02.246916 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.246890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal" event={"ID":"51f413f04a3f9e056bc3cfdb194d79d1","Type":"ContainerStarted","Data":"1ed1d275693e918350b3eb8267a678b079f9982b855a2ead7462be7bc1dfd437"} Apr 16 18:02:02.248062 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.248041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dh898" 
event={"ID":"a787fe1a-b0bd-4485-b5c9-a196d280f7c1","Type":"ContainerStarted","Data":"16f021192f4db9dda31bda34cb1f6aba8fdd3f98cadf6e8ed7ea807878e69195"} Apr 16 18:02:02.248955 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.248931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vzsgf" event={"ID":"446d8c35-b0da-42e5-a071-ea17b9747bb2","Type":"ContainerStarted","Data":"afb3f70f3b0f79019475eb96e5f86980447c179f6ab74fedc7ec07e709b76098"} Apr 16 18:02:02.249782 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.249763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-74z96" event={"ID":"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65","Type":"ContainerStarted","Data":"b946c1a59b19af5fa148e0f6f1d3c4fc1ef254c4258f314c93b60d0407187f19"} Apr 16 18:02:02.250617 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.250600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerStarted","Data":"ca02860e5ac7e233ec8a28b5ff970ac4f5836e1f7b45cdc13885b4b9f2b888d3"} Apr 16 18:02:02.251382 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.251364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" event={"ID":"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f","Type":"ContainerStarted","Data":"e5f0f2c3dd7b4e1607aa9524dffb1e5e56ed4e1b102a0bd234e30b8c92e17ea3"} Apr 16 18:02:02.266398 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.266354 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-213.ec2.internal" podStartSLOduration=2.266341444 podStartE2EDuration="2.266341444s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 18:02:02.265987624 +0000 UTC m=+3.578667137" watchObservedRunningTime="2026-04-16 18:02:02.266341444 +0000 UTC m=+3.579020938" Apr 16 18:02:02.282410 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.282378 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:57:00 +0000 UTC" deadline="2027-12-02 02:31:54.198439217 +0000 UTC" Apr 16 18:02:02.282410 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.282409 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14264h29m51.916034619s" Apr 16 18:02:02.722705 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.722672 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:02:02.890157 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.890067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:02.890157 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.890129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:02.890413 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.890288 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:02.890413 ip-10-0-137-213 kubenswrapper[2576]: 
E0416 18:02:02.890349 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:04.890331568 +0000 UTC m=+6.203011055 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:02.890761 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.890736 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:02.890867 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.890795 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret podName:734a1c0e-a532-48d0-9ded-1550c1e4391c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:04.890780368 +0000 UTC m=+6.203459853 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret") pod "global-pull-secret-syncer-brlng" (UID: "734a1c0e-a532-48d0-9ded-1550c1e4391c") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:02.991695 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:02.990997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:02.991695 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.991196 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:02.991695 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.991216 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:02.991695 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.991245 2576 projected.go:194] Error preparing data for projected volume kube-api-access-p4ps4 for pod openshift-network-diagnostics/network-check-target-ghw9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:02.991695 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:02.991310 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4 podName:310f5d23-e68e-46b7-808d-ca6cb602e572 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:02:04.991288786 +0000 UTC m=+6.303968263 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-p4ps4" (UniqueName: "kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4") pod "network-check-target-ghw9q" (UID: "310f5d23-e68e-46b7-808d-ca6cb602e572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:03.234443 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:03.234370 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:03.234597 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:03.234507 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c" Apr 16 18:02:03.263030 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:03.262991 2576 generic.go:358] "Generic (PLEG): container finished" podID="f4cc9775d55b45f7f0c31dcee502aa5c" containerID="b2178675096efede70b6de80408fefc7e612f5411f2700bc1707872f9942e696" exitCode=0 Apr 16 18:02:03.263573 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:03.263302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal" event={"ID":"f4cc9775d55b45f7f0c31dcee502aa5c","Type":"ContainerDied","Data":"b2178675096efede70b6de80408fefc7e612f5411f2700bc1707872f9942e696"} Apr 16 18:02:04.235031 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:04.234997 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:04.235253 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:04.235141 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:04.235590 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:04.235571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:04.235683 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:04.235662 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:04.283864 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:04.283796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal" event={"ID":"f4cc9775d55b45f7f0c31dcee502aa5c","Type":"ContainerStarted","Data":"e5de4e38caea298baaa02fc4a589426a2fbbc2e72043991f6ba63560e434f679"} Apr 16 18:02:04.297997 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:04.297934 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-213.ec2.internal" podStartSLOduration=4.297910846 podStartE2EDuration="4.297910846s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:02:04.297482392 +0000 UTC m=+5.610161886" watchObservedRunningTime="2026-04-16 18:02:04.297910846 +0000 UTC m=+5.610590341" Apr 16 18:02:04.908342 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:04.908304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:04.908540 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:04.908363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:04.908540 ip-10-0-137-213 
kubenswrapper[2576]: E0416 18:02:04.908507 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:04.908648 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:04.908570 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:08.908551503 +0000 UTC m=+10.221230981 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:04.908971 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:04.908946 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:04.909068 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:04.908994 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret podName:734a1c0e-a532-48d0-9ded-1550c1e4391c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:08.908979796 +0000 UTC m=+10.221659271 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret") pod "global-pull-secret-syncer-brlng" (UID: "734a1c0e-a532-48d0-9ded-1550c1e4391c") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:05.009703 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:05.008831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:05.009703 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:05.009040 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:05.009703 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:05.009058 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:05.009703 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:05.009070 2576 projected.go:194] Error preparing data for projected volume kube-api-access-p4ps4 for pod openshift-network-diagnostics/network-check-target-ghw9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:05.009703 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:05.009128 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4 podName:310f5d23-e68e-46b7-808d-ca6cb602e572 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:02:09.009110778 +0000 UTC m=+10.321790253 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-p4ps4" (UniqueName: "kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4") pod "network-check-target-ghw9q" (UID: "310f5d23-e68e-46b7-808d-ca6cb602e572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:05.234534 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:05.234455 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:05.234683 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:05.234631 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c" Apr 16 18:02:06.234494 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:06.234455 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:06.234980 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:06.234585 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:06.234980 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:06.234916 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:06.235075 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:06.234991 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:07.236413 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:07.236374 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:07.236806 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:07.236525 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c" Apr 16 18:02:08.234824 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:08.234787 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:08.234824 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:08.234828 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:08.235004 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:08.234924 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:08.235102 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:08.235044 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:08.941141 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:08.940466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:08.941141 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:08.940529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:08.941141 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:08.940654 2576 secret.go:189] 
Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:08.941141 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:08.940718 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:16.940697431 +0000 UTC m=+18.253376907 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:08.941141 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:08.941065 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:08.941141 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:08.941109 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret podName:734a1c0e-a532-48d0-9ded-1550c1e4391c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:16.941095322 +0000 UTC m=+18.253774794 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret") pod "global-pull-secret-syncer-brlng" (UID: "734a1c0e-a532-48d0-9ded-1550c1e4391c") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:09.042377 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:09.041810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:09.042377 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:09.041973 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:09.042377 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:09.041988 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:09.042377 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:09.041999 2576 projected.go:194] Error preparing data for projected volume kube-api-access-p4ps4 for pod openshift-network-diagnostics/network-check-target-ghw9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:09.042377 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:09.042058 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4 podName:310f5d23-e68e-46b7-808d-ca6cb602e572 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:17.042039066 +0000 UTC m=+18.354718541 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-p4ps4" (UniqueName: "kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4") pod "network-check-target-ghw9q" (UID: "310f5d23-e68e-46b7-808d-ca6cb602e572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:09.235412 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:09.235309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:09.236266 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:09.236174 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:10.234871 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:10.234836 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:10.235335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:10.234845 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:10.235335 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:10.234976 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:10.235335 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:10.235015 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:11.234488 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:11.234457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:11.234669 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:11.234574 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:12.235271 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:12.235242 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:12.235699 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:12.235367 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:12.235699 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:12.235422 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:12.235699 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:12.235540 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:13.234560 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:13.234519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:13.234735 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:13.234661 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:14.234489 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:14.234452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:14.234903 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:14.234566 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:14.234903 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:14.234638 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:14.234903 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:14.234777 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:15.235273 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:15.235218 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:15.235732 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:15.235355 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:16.234450 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:16.234415 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:16.234450 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:16.234443 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:16.234660 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:16.234516 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:16.234719 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:16.234695 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:17.000536 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:17.000496 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:17.001106 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:17.000552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:17.001106 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.000664 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:17.001106 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.000664 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:17.001106 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.000736 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret podName:734a1c0e-a532-48d0-9ded-1550c1e4391c nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.000719499 +0000 UTC m=+34.313398989 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret") pod "global-pull-secret-syncer-brlng" (UID: "734a1c0e-a532-48d0-9ded-1550c1e4391c") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:02:17.001106 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.000757 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.000746993 +0000 UTC m=+34.313426464 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:02:17.101665 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:17.101620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:17.101838 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.101814 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:02:17.101898 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.101846 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:02:17.101898 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.101861 2576 projected.go:194] Error preparing data for projected volume kube-api-access-p4ps4 for pod openshift-network-diagnostics/network-check-target-ghw9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:17.101987 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.101934 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4 podName:310f5d23-e68e-46b7-808d-ca6cb602e572 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.101912683 +0000 UTC m=+34.414592160 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-p4ps4" (UniqueName: "kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4") pod "network-check-target-ghw9q" (UID: "310f5d23-e68e-46b7-808d-ca6cb602e572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:02:17.234371 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:17.234337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:17.234528 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:17.234477 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:18.234535 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:18.234502 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:18.234995 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:18.234501 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:18.234995 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:18.234641 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:18.234995 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:18.234695 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:19.235828 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:19.235371 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:19.235828 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:19.235489 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:20.234877 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.234775 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:20.235045 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.234775 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:20.235045 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:20.234933 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:20.235045 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:20.234993 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:20.313830 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.313796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" event={"ID":"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f","Type":"ContainerStarted","Data":"85f5884be55ae0d4ba23522b280d367c667978cd747e68f077bba6d69a41bd2c"}
Apr 16 18:02:20.316205 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.316185 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:02:20.316462 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.316439 2576 generic.go:358] "Generic (PLEG): container finished" podID="d16ce647-f47f-4f7b-9607-e47c6d4e67ce" containerID="de7f7a60dbbd02143fae98bb9bf22f8d669c3376894cb3fdb5d26c4edbb66021" exitCode=1
Apr 16 18:02:20.316519 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.316497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"132d8bd5461a39aaae72606e96ef3ed5832d53a49ad62fd8280fbeddc869f36d"}
Apr 16 18:02:20.316557 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.316532 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"e7e2f6af28a6c7ed0fd52b0d97faa8ecc2c9d6f205f0639d8a04368adb53a7aa"}
Apr 16 18:02:20.316557 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.316549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"66233e8fe1dac8347f9b38717ad0e2515162733ecc004386b94b46c2e55361a4"}
Apr 16 18:02:20.316631 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.316564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"44ad2a24b500c943c5d766b63f66e3d222e1bac9406fee8ea3f7910a744009c8"}
Apr 16 18:02:20.316631 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.316576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerDied","Data":"de7f7a60dbbd02143fae98bb9bf22f8d669c3376894cb3fdb5d26c4edbb66021"}
Apr 16 18:02:20.316631 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.316590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"64e3ca188bf0717a4474366f201d0c115f70a14f4dd5c876adec488580062c77"}
Apr 16 18:02:20.317812 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.317791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nj79n" event={"ID":"bd40f763-9175-4f8d-850e-89f05f5ff1b8","Type":"ContainerStarted","Data":"9077955112a75dcd52b334150aa4545901b5c8f3a1bd7d3ff6398c9a0f9044c8"}
Apr 16 18:02:20.319006 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.318988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-q59mj" event={"ID":"798435e6-adbf-486f-bd1a-ba36ade6c8d3","Type":"ContainerStarted","Data":"7852e4911befda541af256dc4cc5e99e02be9bc408799f2b0913cb0ee22f9a75"}
Apr 16 18:02:20.320359 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.320334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dh898" event={"ID":"a787fe1a-b0bd-4485-b5c9-a196d280f7c1","Type":"ContainerStarted","Data":"1187b2b20ebb581c17f7f6bbaf0c38f8dab9d114db0e7434fcb864e6882b8e36"}
Apr 16 18:02:20.321602 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.321579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vzsgf" event={"ID":"446d8c35-b0da-42e5-a071-ea17b9747bb2","Type":"ContainerStarted","Data":"1fb71b3c2220c8604daedc683616c7538509851d7c7d66e3bf5a963c2c544c91"}
Apr 16 18:02:20.322794 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.322770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-74z96" event={"ID":"dec1db68-1de1-42a3-a8a4-3f8fa4f45b65","Type":"ContainerStarted","Data":"16fd0850524a337cdf0b94f53f6b30ce02dcbf7403c41e47da898cfedba5c5f8"}
Apr 16 18:02:20.324027 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.324001 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9e0fd91-9b40-48e9-87ac-be0b97367fc5" containerID="488ad975f95f0e0bc953e01e53d192b603004b164f4919f44e2838b2180de6f1" exitCode=0
Apr 16 18:02:20.324104 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.324035 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerDied","Data":"488ad975f95f0e0bc953e01e53d192b603004b164f4919f44e2838b2180de6f1"}
Apr 16 18:02:20.350570 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.350512 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-74z96" podStartSLOduration=4.178769852 podStartE2EDuration="21.350498222s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:02.002849747 +0000 UTC m=+3.315529218" lastFinishedPulling="2026-04-16 18:02:19.17457811 +0000 UTC m=+20.487257588" observedRunningTime="2026-04-16 18:02:20.350361593 +0000 UTC m=+21.663041087" watchObservedRunningTime="2026-04-16 18:02:20.350498222 +0000 UTC m=+21.663177715"
Apr 16 18:02:20.350870 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.350838 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nj79n" podStartSLOduration=4.182627687 podStartE2EDuration="21.350832045s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:02.004699859 +0000 UTC m=+3.317379332" lastFinishedPulling="2026-04-16 18:02:19.172904211 +0000 UTC m=+20.485583690" observedRunningTime="2026-04-16 18:02:20.33312857 +0000 UTC m=+21.645808086" watchObservedRunningTime="2026-04-16 18:02:20.350832045 +0000 UTC m=+21.663511537"
Apr 16 18:02:20.364781 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.364743 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dh898" podStartSLOduration=4.182842236 podStartE2EDuration="21.364729453s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:02.007159567 +0000 UTC m=+3.319839040" lastFinishedPulling="2026-04-16 18:02:19.189046786 +0000 UTC m=+20.501726257" observedRunningTime="2026-04-16 18:02:20.364409164 +0000 UTC m=+21.677088658" watchObservedRunningTime="2026-04-16 18:02:20.364729453 +0000 UTC m=+21.677408959"
Apr 16 18:02:20.379537 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.379478 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-q59mj" podStartSLOduration=4.198456882 podStartE2EDuration="21.379462996s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:02.000138406 +0000 UTC m=+3.312817890" lastFinishedPulling="2026-04-16 18:02:19.181144529 +0000 UTC m=+20.493824004" observedRunningTime="2026-04-16 18:02:20.379245138 +0000 UTC m=+21.691924635" watchObservedRunningTime="2026-04-16 18:02:20.379462996 +0000 UTC m=+21.692142490"
Apr 16 18:02:20.409876 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.409810 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vzsgf" podStartSLOduration=4.243898149 podStartE2EDuration="21.409790037s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:02.006764571 +0000 UTC m=+3.319444043" lastFinishedPulling="2026-04-16 18:02:19.17265646 +0000 UTC m=+20.485335931" observedRunningTime="2026-04-16 18:02:20.409734384 +0000 UTC m=+21.722413876" watchObservedRunningTime="2026-04-16 18:02:20.409790037 +0000 UTC m=+21.722469529"
Apr 16 18:02:20.497054 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.497026 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:02:20.886676 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:20.886591 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nj79n"
Apr 16 18:02:21.051631 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.051598 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nj79n"
Apr 16 18:02:21.052443 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.052415 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nj79n"
Apr 16 18:02:21.234354 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.234278 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:21.234501 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:21.234419 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:21.248574 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.248480 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:02:20.497047404Z","UUID":"79f96c0e-2698-4608-b6f6-a55686101ba9","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:02:21.250166 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.250146 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:02:21.250166 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.250171 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:02:21.328143 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.328104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" event={"ID":"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f","Type":"ContainerStarted","Data":"1e5dc1a21cc8acd298a70000c6afa4ccc3b315d0a70e8aca74e91a3d4ac4a281"}
Apr 16 18:02:21.329970 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.329943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jql59" event={"ID":"1934f30b-41be-47ce-b2a7-9accbed71976","Type":"ContainerStarted","Data":"74b14c520612e79e3880e0f4e17738115eeda39e422471a43babe8ee81a29492"}
Apr 16 18:02:21.330932 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.330910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nj79n"
Apr 16 18:02:21.365047 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:21.365003 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jql59" podStartSLOduration=5.200771805 podStartE2EDuration="22.364987783s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:02.008406574 +0000 UTC m=+3.321086048" lastFinishedPulling="2026-04-16 18:02:19.172622553 +0000 UTC m=+20.485302026" observedRunningTime="2026-04-16 18:02:21.364978926 +0000 UTC m=+22.677658420" watchObservedRunningTime="2026-04-16 18:02:21.364987783 +0000 UTC m=+22.677667278"
Apr 16 18:02:22.234542 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:22.234512 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:22.234542 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:22.234535 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:22.234739 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:22.234636 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:22.234791 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:22.234771 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:22.334204 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:22.333845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" event={"ID":"f8fb6f5b-4a16-4820-b62a-60ccc1a0388f","Type":"ContainerStarted","Data":"22e8df96eb70573bd4e6e119ada0810b6043b42defd8846bbca4b238d1c90d1a"}
Apr 16 18:02:22.337576 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:22.337555 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:02:22.337943 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:22.337903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"2ce19ee5c22ebc55ab1d83299514ad5fcd7196ccb9c1cc662ab24ac267305fae"}
Apr 16 18:02:22.378079 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:22.377977 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nv76w" podStartSLOduration=3.763725435 podStartE2EDuration="23.377957116s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:02.007638188 +0000 UTC m=+3.320317662" lastFinishedPulling="2026-04-16 18:02:21.621869858 +0000 UTC m=+22.934549343" observedRunningTime="2026-04-16 18:02:22.377616821 +0000 UTC m=+23.690296335" watchObservedRunningTime="2026-04-16 18:02:22.377957116 +0000 UTC m=+23.690636610"
Apr 16 18:02:23.234499 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:23.234462 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:23.234696 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:23.234629 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:24.235246 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:24.235195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:02:24.235783 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:24.235199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:02:24.235783 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:24.235340 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572"
Apr 16 18:02:24.235783 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:24.235380 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952"
Apr 16 18:02:25.234815 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:25.234637 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:02:25.234994 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:25.234903 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c"
Apr 16 18:02:25.345319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:25.345284 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9e0fd91-9b40-48e9-87ac-be0b97367fc5" containerID="340460fe7b931305bdeede83a51621dd955954048516aadcedaad231312d9549" exitCode=0
Apr 16 18:02:25.346042 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:25.345375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerDied","Data":"340460fe7b931305bdeede83a51621dd955954048516aadcedaad231312d9549"}
Apr 16 18:02:25.348490 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:25.348412 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:02:25.348743 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:25.348721 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"cf4aee4e17650945e7394d97d684d11e932362a68b79f010e0eb44828679343c"}
Apr 16 18:02:25.349006 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:25.348992 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2"
Apr 16 18:02:25.349074 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:25.349018 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2"
Apr 16 18:02:25.349176 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:25.349160 2576 scope.go:117] "RemoveContainer" containerID="de7f7a60dbbd02143fae98bb9bf22f8d669c3376894cb3fdb5d26c4edbb66021"
Apr 16 18:02:25.364412 ip-10-0-137-213 kubenswrapper[2576]: I0416
18:02:25.364393 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:26.235154 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.235118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:26.235317 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.235120 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:26.235317 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:26.235215 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:26.235317 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:26.235310 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:26.352244 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.352144 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9e0fd91-9b40-48e9-87ac-be0b97367fc5" containerID="627833ba0dd05c9b85bb85800ef7aa026da9eef5c5a73e9e9842e46439eca33e" exitCode=0 Apr 16 18:02:26.352697 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.352247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerDied","Data":"627833ba0dd05c9b85bb85800ef7aa026da9eef5c5a73e9e9842e46439eca33e"} Apr 16 18:02:26.355693 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.355675 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log" Apr 16 18:02:26.356056 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.356027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" event={"ID":"d16ce647-f47f-4f7b-9607-e47c6d4e67ce","Type":"ContainerStarted","Data":"4f8680a1fbeaa28d524374cdca8f211222f270eb184d81e45f407cc93031722c"} Apr 16 18:02:26.356365 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.356335 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:26.371052 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.371031 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" Apr 16 18:02:26.410903 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.410854 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2" podStartSLOduration=9.939179535 podStartE2EDuration="27.410842503s" 
podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:02.005677603 +0000 UTC m=+3.318357076" lastFinishedPulling="2026-04-16 18:02:19.477340564 +0000 UTC m=+20.790020044" observedRunningTime="2026-04-16 18:02:26.410697374 +0000 UTC m=+27.723376866" watchObservedRunningTime="2026-04-16 18:02:26.410842503 +0000 UTC m=+27.723522013" Apr 16 18:02:26.423986 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.423961 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-brlng"] Apr 16 18:02:26.424128 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.424091 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:26.424191 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:26.424173 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c" Apr 16 18:02:26.426877 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.426848 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ghw9q"] Apr 16 18:02:26.426990 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.426925 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:26.427047 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:26.426996 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:26.429952 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.429933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-892g8"] Apr 16 18:02:26.430041 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:26.430009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:26.430099 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:26.430082 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:27.359842 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:27.359749 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9e0fd91-9b40-48e9-87ac-be0b97367fc5" containerID="be790b1f464c22a17fdd5067f6e69526a50af4f1fd9fe03fa419db79b11c6280" exitCode=0 Apr 16 18:02:27.359842 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:27.359832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerDied","Data":"be790b1f464c22a17fdd5067f6e69526a50af4f1fd9fe03fa419db79b11c6280"} Apr 16 18:02:28.234911 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:28.234835 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:28.234911 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:28.234870 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:28.235109 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:28.234934 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:28.235109 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:28.234957 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:28.235109 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:28.235049 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:28.235109 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:28.235104 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c" Apr 16 18:02:29.237578 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:29.237421 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:29.237980 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:29.237668 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:30.234510 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:30.234474 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:30.234510 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:30.234499 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:30.234724 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:30.234583 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:30.234781 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:30.234718 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c" Apr 16 18:02:31.234831 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:31.234779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:31.235413 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:31.234925 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:02:32.234902 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.234866 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:32.234902 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.234891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:32.235518 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.234993 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ghw9q" podUID="310f5d23-e68e-46b7-808d-ca6cb602e572" Apr 16 18:02:32.235518 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.235073 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-brlng" podUID="734a1c0e-a532-48d0-9ded-1550c1e4391c" Apr 16 18:02:32.488922 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.488843 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-213.ec2.internal" event="NodeReady" Apr 16 18:02:32.489068 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.489017 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:02:32.544781 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.544741 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6c869d7d6-hdnf9"] Apr 16 18:02:32.557821 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.557797 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.565983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.565957 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hdkzb\"" Apr 16 18:02:32.566120 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.566031 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:02:32.566120 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.566109 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:02:32.567018 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.566746 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:02:32.567884 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.567570 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4zjt2"] Apr 16 18:02:32.573991 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:02:32.573971 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:02:32.574771 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.574754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:32.578064 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.578043 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:02:32.578435 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.578411 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:02:32.578435 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.578422 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-p7xs6\"" Apr 16 18:02:32.578569 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.578476 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:02:32.589641 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.589619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c869d7d6-hdnf9"] Apr 16 18:02:32.590197 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.590176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4zjt2"] Apr 16 18:02:32.675113 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.675087 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j6hg6"] Apr 16 18:02:32.694455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.694427 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.702315 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.702292 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:02:32.702808 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.702780 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:02:32.702808 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.702795 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lh6dc\"" Apr 16 18:02:32.710505 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.710486 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j6hg6"] Apr 16 18:02:32.719500 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:32.719628 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-image-registry-private-configuration\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.719628 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.719628 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-bound-sa-token\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.719628 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-trusted-ca\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.719628 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9s6h\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-kube-api-access-m9s6h\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.719836 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d9b3b2d-2a26-4003-848d-306ce8d13daa-ca-trust-extracted\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " 
pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.719836 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719690 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-certificates\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.719836 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbqxd\" (UniqueName: \"kubernetes.io/projected/cec75d81-47b0-42a8-b1a3-27ed663fc255-kube-api-access-cbqxd\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:32.719836 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.719750 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-installation-pull-secrets\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.820904 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.820820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:32.820904 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.820860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-image-registry-private-configuration\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.820904 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.820886 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821141 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.820913 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-bound-sa-token\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821141 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.820943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.821141 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.820970 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:32.821141 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.820994 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:32.821141 
ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.821010 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c869d7d6-hdnf9: secret "image-registry-tls" not found Apr 16 18:02:32.821141 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.820971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-trusted-ca\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821141 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.821057 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert podName:cec75d81-47b0-42a8-b1a3-27ed663fc255 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.321036904 +0000 UTC m=+34.633716375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert") pod "ingress-canary-4zjt2" (UID: "cec75d81-47b0-42a8-b1a3-27ed663fc255") : secret "canary-serving-cert" not found Apr 16 18:02:32.821141 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.821081 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls podName:7d9b3b2d-2a26-4003-848d-306ce8d13daa nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.321072345 +0000 UTC m=+34.633751820 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls") pod "image-registry-6c869d7d6-hdnf9" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa") : secret "image-registry-tls" not found Apr 16 18:02:32.821141 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9s6h\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-kube-api-access-m9s6h\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821605 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d9b3b2d-2a26-4003-848d-306ce8d13daa-ca-trust-extracted\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821605 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821191 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-config-volume\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.821605 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6krmj\" (UniqueName: \"kubernetes.io/projected/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-kube-api-access-6krmj\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.821605 
ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-certificates\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821605 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbqxd\" (UniqueName: \"kubernetes.io/projected/cec75d81-47b0-42a8-b1a3-27ed663fc255-kube-api-access-cbqxd\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:32.821605 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-installation-pull-secrets\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821605 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-tmp-dir\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.821856 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d9b3b2d-2a26-4003-848d-306ce8d13daa-ca-trust-extracted\") pod \"image-registry-6c869d7d6-hdnf9\" 
(UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821856 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-certificates\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.821929 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.821856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-trusted-ca\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.825179 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.825155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-image-registry-private-configuration\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.825294 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.825156 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-installation-pull-secrets\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.831956 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.831925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-bound-sa-token\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.832060 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.832043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbqxd\" (UniqueName: \"kubernetes.io/projected/cec75d81-47b0-42a8-b1a3-27ed663fc255-kube-api-access-cbqxd\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:32.835685 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.835655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9s6h\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-kube-api-access-m9s6h\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:32.922746 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.922704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-config-volume\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.922746 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.922741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6krmj\" (UniqueName: \"kubernetes.io/projected/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-kube-api-access-6krmj\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.922981 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.922781 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-tmp-dir\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.922981 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.922825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.922981 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.922934 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:32.923115 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:32.922995 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls podName:7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:33.422980277 +0000 UTC m=+34.735659747 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls") pod "dns-default-j6hg6" (UID: "7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf") : secret "dns-default-metrics-tls" not found Apr 16 18:02:32.923171 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.923124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-tmp-dir\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.923315 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.923298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-config-volume\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:32.935484 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:32.935461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6krmj\" (UniqueName: \"kubernetes.io/projected/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-kube-api-access-6krmj\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:33.023204 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.023166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:33.023204 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.023211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:33.023405 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.023328 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:33.023405 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.023356 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:33.023405 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.023405 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret podName:734a1c0e-a532-48d0-9ded-1550c1e4391c nodeName:}" failed. No retries permitted until 2026-04-16 18:03:05.023391575 +0000 UTC m=+66.336071046 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret") pod "global-pull-secret-syncer-brlng" (UID: "734a1c0e-a532-48d0-9ded-1550c1e4391c") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:02:33.023541 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.023422 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:05.023410853 +0000 UTC m=+66.336090325 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:02:33.124013 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.123937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:33.124156 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.124106 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:02:33.124156 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.124128 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:02:33.124156 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.124137 2576 projected.go:194] Error preparing data for projected volume kube-api-access-p4ps4 for pod openshift-network-diagnostics/network-check-target-ghw9q: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:33.124273 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.124190 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4 podName:310f5d23-e68e-46b7-808d-ca6cb602e572 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:03:05.124175068 +0000 UTC m=+66.436854540 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-p4ps4" (UniqueName: "kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4") pod "network-check-target-ghw9q" (UID: "310f5d23-e68e-46b7-808d-ca6cb602e572") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:02:33.234350 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.234312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:02:33.241625 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.241601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:02:33.242358 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.241834 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2dnp6\"" Apr 16 18:02:33.325875 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.325846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:33.326002 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.325989 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:33.326045 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.326005 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-6c869d7d6-hdnf9: secret "image-registry-tls" not found Apr 16 18:02:33.326045 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.326028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:33.326118 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.326049 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls podName:7d9b3b2d-2a26-4003-848d-306ce8d13daa nodeName:}" failed. No retries permitted until 2026-04-16 18:02:34.32603381 +0000 UTC m=+35.638713280 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls") pod "image-registry-6c869d7d6-hdnf9" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa") : secret "image-registry-tls" not found Apr 16 18:02:33.326167 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.326116 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:33.326199 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.326174 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert podName:cec75d81-47b0-42a8-b1a3-27ed663fc255 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:34.32615617 +0000 UTC m=+35.638835662 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert") pod "ingress-canary-4zjt2" (UID: "cec75d81-47b0-42a8-b1a3-27ed663fc255") : secret "canary-serving-cert" not found Apr 16 18:02:33.373347 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.373309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerStarted","Data":"6736ac8ba02f2ec0b329cd86809c99fcea8575aff921e860471d30c5bb7ec87c"} Apr 16 18:02:33.426845 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:33.426756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:33.426995 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.426903 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:33.426995 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:33.426970 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls podName:7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:34.42695351 +0000 UTC m=+35.739632982 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls") pod "dns-default-j6hg6" (UID: "7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf") : secret "dns-default-metrics-tls" not found Apr 16 18:02:34.234870 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.234835 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q" Apr 16 18:02:34.235044 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.234904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng" Apr 16 18:02:34.241142 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.241118 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:02:34.241557 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.241538 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:02:34.241659 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.241598 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:02:34.242446 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.242431 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j756c\"" Apr 16 18:02:34.334208 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.334166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:34.334208 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.334209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:34.334435 
ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:34.334346 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:34.334435 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:34.334391 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:34.334435 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:34.334408 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c869d7d6-hdnf9: secret "image-registry-tls" not found Apr 16 18:02:34.334547 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:34.334410 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert podName:cec75d81-47b0-42a8-b1a3-27ed663fc255 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:36.334393979 +0000 UTC m=+37.647073449 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert") pod "ingress-canary-4zjt2" (UID: "cec75d81-47b0-42a8-b1a3-27ed663fc255") : secret "canary-serving-cert" not found Apr 16 18:02:34.334547 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:34.334460 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls podName:7d9b3b2d-2a26-4003-848d-306ce8d13daa nodeName:}" failed. No retries permitted until 2026-04-16 18:02:36.334446417 +0000 UTC m=+37.647125889 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls") pod "image-registry-6c869d7d6-hdnf9" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa") : secret "image-registry-tls" not found Apr 16 18:02:34.377755 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.377720 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9e0fd91-9b40-48e9-87ac-be0b97367fc5" containerID="6736ac8ba02f2ec0b329cd86809c99fcea8575aff921e860471d30c5bb7ec87c" exitCode=0 Apr 16 18:02:34.377902 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.377790 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerDied","Data":"6736ac8ba02f2ec0b329cd86809c99fcea8575aff921e860471d30c5bb7ec87c"} Apr 16 18:02:34.435089 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:34.435059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:34.435259 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:34.435241 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:34.435336 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:34.435304 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls podName:7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:36.435285804 +0000 UTC m=+37.747965277 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls") pod "dns-default-j6hg6" (UID: "7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf") : secret "dns-default-metrics-tls" not found Apr 16 18:02:35.381645 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:35.381610 2576 generic.go:358] "Generic (PLEG): container finished" podID="b9e0fd91-9b40-48e9-87ac-be0b97367fc5" containerID="81d0a34ff04a0e248242d84a668f78da71ddcebbc317e23f5620476cc77007b3" exitCode=0 Apr 16 18:02:35.382083 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:35.381666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerDied","Data":"81d0a34ff04a0e248242d84a668f78da71ddcebbc317e23f5620476cc77007b3"} Apr 16 18:02:36.349566 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:36.349330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:36.349566 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:36.349526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:36.349782 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:36.349470 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:36.349782 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:36.349636 2576 projected.go:264] 
Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:36.349782 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:36.349646 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c869d7d6-hdnf9: secret "image-registry-tls" not found Apr 16 18:02:36.349782 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:36.349664 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert podName:cec75d81-47b0-42a8-b1a3-27ed663fc255 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:40.349640165 +0000 UTC m=+41.662319636 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert") pod "ingress-canary-4zjt2" (UID: "cec75d81-47b0-42a8-b1a3-27ed663fc255") : secret "canary-serving-cert" not found Apr 16 18:02:36.349782 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:36.349683 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls podName:7d9b3b2d-2a26-4003-848d-306ce8d13daa nodeName:}" failed. No retries permitted until 2026-04-16 18:02:40.349672243 +0000 UTC m=+41.662351714 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls") pod "image-registry-6c869d7d6-hdnf9" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa") : secret "image-registry-tls" not found Apr 16 18:02:36.385971 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:36.385937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" event={"ID":"b9e0fd91-9b40-48e9-87ac-be0b97367fc5","Type":"ContainerStarted","Data":"b85dd6ad1b65b6b0e14c0ad70a5b371fe7fb259488888ae01b76decf8dd9707a"} Apr 16 18:02:36.423128 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:36.423088 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gnj8w" podStartSLOduration=6.245319788 podStartE2EDuration="37.423075032s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:02:01.994931236 +0000 UTC m=+3.307610708" lastFinishedPulling="2026-04-16 18:02:33.172686467 +0000 UTC m=+34.485365952" observedRunningTime="2026-04-16 18:02:36.422420053 +0000 UTC m=+37.735099564" watchObservedRunningTime="2026-04-16 18:02:36.423075032 +0000 UTC m=+37.735754524" Apr 16 18:02:36.450645 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:36.450619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:02:36.450757 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:36.450744 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:02:36.450811 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:36.450801 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls podName:7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:40.450789341 +0000 UTC m=+41.763468815 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls") pod "dns-default-j6hg6" (UID: "7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf") : secret "dns-default-metrics-tls" not found Apr 16 18:02:40.381400 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:40.381361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:02:40.381400 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:40.381404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:02:40.381938 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:40.381524 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:02:40.381938 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:40.381604 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert podName:cec75d81-47b0-42a8-b1a3-27ed663fc255 nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.381584288 +0000 UTC m=+49.694263779 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert") pod "ingress-canary-4zjt2" (UID: "cec75d81-47b0-42a8-b1a3-27ed663fc255") : secret "canary-serving-cert" not found Apr 16 18:02:40.381938 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:40.381533 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:02:40.381938 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:40.381636 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c869d7d6-hdnf9: secret "image-registry-tls" not found Apr 16 18:02:40.381938 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:40.381714 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls podName:7d9b3b2d-2a26-4003-848d-306ce8d13daa nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.381702994 +0000 UTC m=+49.694382466 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls") pod "image-registry-6c869d7d6-hdnf9" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa") : secret "image-registry-tls" not found
Apr 16 18:02:40.482378 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:40.482344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6"
Apr 16 18:02:40.482547 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:40.482494 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:40.482595 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:40.482576 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls podName:7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf nodeName:}" failed. No retries permitted until 2026-04-16 18:02:48.482556732 +0000 UTC m=+49.795236202 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls") pod "dns-default-j6hg6" (UID: "7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:48.441968 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:48.441922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2"
Apr 16 18:02:48.441968 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:48.441971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9"
Apr 16 18:02:48.442549 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:48.442056 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:02:48.442549 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:48.442059 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:02:48.442549 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:48.442070 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c869d7d6-hdnf9: secret "image-registry-tls" not found
Apr 16 18:02:48.442549 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:48.442136 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls podName:7d9b3b2d-2a26-4003-848d-306ce8d13daa nodeName:}" failed. No retries permitted until 2026-04-16 18:03:04.442120574 +0000 UTC m=+65.754800051 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls") pod "image-registry-6c869d7d6-hdnf9" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa") : secret "image-registry-tls" not found
Apr 16 18:02:48.442549 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:48.442160 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert podName:cec75d81-47b0-42a8-b1a3-27ed663fc255 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:04.442153682 +0000 UTC m=+65.754833153 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert") pod "ingress-canary-4zjt2" (UID: "cec75d81-47b0-42a8-b1a3-27ed663fc255") : secret "canary-serving-cert" not found
Apr 16 18:02:48.542654 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:48.542615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6"
Apr 16 18:02:48.542816 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:48.542762 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:02:48.542859 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:02:48.542820 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls podName:7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf nodeName:}" failed. No retries permitted until 2026-04-16 18:03:04.542806335 +0000 UTC m=+65.855485807 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls") pod "dns-default-j6hg6" (UID: "7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf") : secret "dns-default-metrics-tls" not found
Apr 16 18:02:58.373045 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:02:58.373017 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-k7zg2"
Apr 16 18:03:04.463723 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:04.463682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2"
Apr 16 18:03:04.463723 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:04.463729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9"
Apr 16 18:03:04.464269 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:04.463824 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:04.464269 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:04.463827 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:03:04.464269 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:04.463894 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert podName:cec75d81-47b0-42a8-b1a3-27ed663fc255 nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.463878665 +0000 UTC m=+97.776558141 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert") pod "ingress-canary-4zjt2" (UID: "cec75d81-47b0-42a8-b1a3-27ed663fc255") : secret "canary-serving-cert" not found
Apr 16 18:03:04.464269 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:04.463902 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c869d7d6-hdnf9: secret "image-registry-tls" not found
Apr 16 18:03:04.464269 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:04.463968 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls podName:7d9b3b2d-2a26-4003-848d-306ce8d13daa nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.463954306 +0000 UTC m=+97.776633778 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls") pod "image-registry-6c869d7d6-hdnf9" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa") : secret "image-registry-tls" not found
Apr 16 18:03:04.564401 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:04.564367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6"
Apr 16 18:03:04.564557 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:04.564481 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:04.564557 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:04.564539 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls podName:7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf nodeName:}" failed. No retries permitted until 2026-04-16 18:03:36.564525133 +0000 UTC m=+97.877204605 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls") pod "dns-default-j6hg6" (UID: "7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:05.068640 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.068605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:03:05.068640 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.068646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:03:05.071909 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.071895 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:03:05.072893 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.072880 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 18:03:05.079115 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:05.079088 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:03:05.079177 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:05.079152 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:09.079137793 +0000 UTC m=+130.391817263 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : secret "metrics-daemon-secret" not found
Apr 16 18:03:05.082274 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.082221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/734a1c0e-a532-48d0-9ded-1550c1e4391c-original-pull-secret\") pod \"global-pull-secret-syncer-brlng\" (UID: \"734a1c0e-a532-48d0-9ded-1550c1e4391c\") " pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:03:05.145332 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.145297 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-brlng"
Apr 16 18:03:05.169598 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.169562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:03:05.172965 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.172946 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 18:03:05.183080 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.183055 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 18:03:05.193260 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.193236 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4ps4\" (UniqueName: \"kubernetes.io/projected/310f5d23-e68e-46b7-808d-ca6cb602e572-kube-api-access-p4ps4\") pod \"network-check-target-ghw9q\" (UID: \"310f5d23-e68e-46b7-808d-ca6cb602e572\") " pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:03:05.278394 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.278363 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-brlng"]
Apr 16 18:03:05.281611 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:03:05.281583 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod734a1c0e_a532_48d0_9ded_1550c1e4391c.slice/crio-5ab2cb629fcfe5e8f26048af3304269d93c161bc0d0227ddf87e8c5872e43fc2 WatchSource:0}: Error finding container 5ab2cb629fcfe5e8f26048af3304269d93c161bc0d0227ddf87e8c5872e43fc2: Status 404 returned error can't find the container with id 5ab2cb629fcfe5e8f26048af3304269d93c161bc0d0227ddf87e8c5872e43fc2
Apr 16 18:03:05.438248 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.438146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-brlng" event={"ID":"734a1c0e-a532-48d0-9ded-1550c1e4391c","Type":"ContainerStarted","Data":"5ab2cb629fcfe5e8f26048af3304269d93c161bc0d0227ddf87e8c5872e43fc2"}
Apr 16 18:03:05.452421 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.452398 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-j756c\""
Apr 16 18:03:05.459418 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.459395 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:03:05.596771 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:05.596730 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ghw9q"]
Apr 16 18:03:05.601375 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:03:05.601345 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod310f5d23_e68e_46b7_808d_ca6cb602e572.slice/crio-ea3ecc7fe969e761766421a4d1b1bb5df8e0ce091c7bce960b3da00579f7f0cf WatchSource:0}: Error finding container ea3ecc7fe969e761766421a4d1b1bb5df8e0ce091c7bce960b3da00579f7f0cf: Status 404 returned error can't find the container with id ea3ecc7fe969e761766421a4d1b1bb5df8e0ce091c7bce960b3da00579f7f0cf
Apr 16 18:03:06.441789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:06.441736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ghw9q" event={"ID":"310f5d23-e68e-46b7-808d-ca6cb602e572","Type":"ContainerStarted","Data":"ea3ecc7fe969e761766421a4d1b1bb5df8e0ce091c7bce960b3da00579f7f0cf"}
Apr 16 18:03:10.450549 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:10.450509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-brlng" event={"ID":"734a1c0e-a532-48d0-9ded-1550c1e4391c","Type":"ContainerStarted","Data":"252b1e09dcba3501d577ceb8383b798a86cec35ea534ad4d1e78b54e3169c115"}
Apr 16 18:03:10.451863 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:10.451840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ghw9q" event={"ID":"310f5d23-e68e-46b7-808d-ca6cb602e572","Type":"ContainerStarted","Data":"e4ae4e6fdc47e6fa8122ac20b18cbcd6092675eb55e637dbb585fa545e2a08b0"}
Apr 16 18:03:10.451966 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:10.451957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:03:10.485396 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:10.485340 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-brlng" podStartSLOduration=65.940906096 podStartE2EDuration="1m10.485323461s" podCreationTimestamp="2026-04-16 18:02:00 +0000 UTC" firstStartedPulling="2026-04-16 18:03:05.283243023 +0000 UTC m=+66.595922496" lastFinishedPulling="2026-04-16 18:03:09.827660387 +0000 UTC m=+71.140339861" observedRunningTime="2026-04-16 18:03:10.465761082 +0000 UTC m=+71.778440575" watchObservedRunningTime="2026-04-16 18:03:10.485323461 +0000 UTC m=+71.798002951"
Apr 16 18:03:10.485640 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:10.485609 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ghw9q" podStartSLOduration=67.265111223 podStartE2EDuration="1m11.485599972s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:03:05.602895697 +0000 UTC m=+66.915575168" lastFinishedPulling="2026-04-16 18:03:09.823384446 +0000 UTC m=+71.136063917" observedRunningTime="2026-04-16 18:03:10.485189716 +0000 UTC m=+71.797869208" watchObservedRunningTime="2026-04-16 18:03:10.485599972 +0000 UTC m=+71.798279464"
Apr 16 18:03:36.508523 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:36.508491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2"
Apr 16 18:03:36.508523 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:36.508529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9"
Apr 16 18:03:36.508962 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:36.508630 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 18:03:36.508962 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:36.508631 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:03:36.508962 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:36.508641 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6c869d7d6-hdnf9: secret "image-registry-tls" not found
Apr 16 18:03:36.508962 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:36.508699 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert podName:cec75d81-47b0-42a8-b1a3-27ed663fc255 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:40.508685596 +0000 UTC m=+161.821365072 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert") pod "ingress-canary-4zjt2" (UID: "cec75d81-47b0-42a8-b1a3-27ed663fc255") : secret "canary-serving-cert" not found
Apr 16 18:03:36.508962 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:36.508714 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls podName:7d9b3b2d-2a26-4003-848d-306ce8d13daa nodeName:}" failed. No retries permitted until 2026-04-16 18:04:40.508708389 +0000 UTC m=+161.821387859 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls") pod "image-registry-6c869d7d6-hdnf9" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa") : secret "image-registry-tls" not found
Apr 16 18:03:36.609261 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:36.609207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6"
Apr 16 18:03:36.609425 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:36.609351 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:03:36.609425 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:03:36.609418 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls podName:7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf nodeName:}" failed. No retries permitted until 2026-04-16 18:04:40.609399068 +0000 UTC m=+161.922078540 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls") pod "dns-default-j6hg6" (UID: "7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf") : secret "dns-default-metrics-tls" not found
Apr 16 18:03:41.456462 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:03:41.456427 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ghw9q"
Apr 16 18:04:09.139834 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:09.139795 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:04:09.140330 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:09.139946 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:04:09.140330 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:09.140019 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs podName:f342f33f-7ce1-4c45-a212-83b4c6fe1952 nodeName:}" failed. No retries permitted until 2026-04-16 18:06:11.140003498 +0000 UTC m=+252.452682969 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs") pod "network-metrics-daemon-892g8" (UID: "f342f33f-7ce1-4c45-a212-83b4c6fe1952") : secret "metrics-daemon-secret" not found
Apr 16 18:04:23.187022 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.186984 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95"]
Apr 16 18:04:23.191668 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.191640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95"
Apr 16 18:04:23.194463 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.194439 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:04:23.194594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.194537 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 18:04:23.194594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.194564 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-5996s\""
Apr 16 18:04:23.200034 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.199988 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95"]
Apr 16 18:04:23.237158 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.237129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d96cb\" (UniqueName: \"kubernetes.io/projected/f337f655-cd92-41f3-a722-832193387a64-kube-api-access-d96cb\") pod \"volume-data-source-validator-7d955d5dd4-qzg95\" (UID: \"f337f655-cd92-41f3-a722-832193387a64\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95"
Apr 16 18:04:23.295588 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.295552 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv"]
Apr 16 18:04:23.298514 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.298486 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv"
Apr 16 18:04:23.299990 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.299965 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r"]
Apr 16 18:04:23.302721 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.302702 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qw5wn"]
Apr 16 18:04:23.302847 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.302834 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r"
Apr 16 18:04:23.305321 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.305299 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-58fbdb974b-w74b9"]
Apr 16 18:04:23.305446 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.305430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn"
Apr 16 18:04:23.308092 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.308077 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-58fbdb974b-w74b9"
Apr 16 18:04:23.309519 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.309501 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 18:04:23.309616 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.309537 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 18:04:23.309985 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.309969 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 18:04:23.310166 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.310140 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:04:23.311074 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.311054 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 18:04:23.311151 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.311059 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-wwh78\""
Apr 16 18:04:23.318194 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.317897 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 18:04:23.318507 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.318485 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 18:04:23.318601 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.318497 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 18:04:23.318661 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.318600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 18:04:23.318851 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.318833 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:04:23.318937 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.318871 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 18:04:23.318937 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.318886 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 16 18:04:23.318937 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.318899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 16 18:04:23.319097 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.319083 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 18:04:23.319244 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.319210 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 18:04:23.319311 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.319254 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 18:04:23.319506 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.319485 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 18:04:23.319723 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.319692 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-xjwf9\""
Apr 16 18:04:23.319820 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.319805 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8sjss\""
Apr 16 18:04:23.319884 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.319698 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hx9hq\""
Apr 16 18:04:23.319963 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.319943 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r"]
Apr 16 18:04:23.321028 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.321008 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qw5wn"]
Apr 16 18:04:23.326675 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.326657 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 16 18:04:23.331197 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.331174 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58fbdb974b-w74b9"]
Apr 16 18:04:23.338175 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0720a490-adba-45e7-a242-c37073172c9a-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv"
Apr 16 18:04:23.338295 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338189 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqlg2\" (UniqueName: \"kubernetes.io/projected/0720a490-adba-45e7-a242-c37073172c9a-kube-api-access-sqlg2\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv"
Apr 16 18:04:23.338295 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbpt2\" (UniqueName: \"kubernetes.io/projected/4db8d207-3ea1-45c0-91a6-c173825a77bb-kube-api-access-hbpt2\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9"
Apr 16 18:04:23.338295 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ae1511-2083-4a0c-9d5a-a993e57fa083-serving-cert\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn"
Apr 16 18:04:23.338410 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338304 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9"
Apr 16 18:04:23.338410 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-stats-auth\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9"
Apr 16 18:04:23.338470 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-default-certificate\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9"
Apr 16 18:04:23.338470 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ae1511-2083-4a0c-9d5a-a993e57fa083-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn"
Apr 16 18:04:23.338551 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r"
Apr 16 18:04:23.338551 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7fr7\" (UniqueName: \"kubernetes.io/projected/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-kube-api-access-s7fr7\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r"
Apr 16 18:04:23.338633 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cn4c\" (UniqueName: \"kubernetes.io/projected/51ae1511-2083-4a0c-9d5a-a993e57fa083-kube-api-access-5cn4c\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn"
Apr 16 18:04:23.338633 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0720a490-adba-45e7-a242-c37073172c9a-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv"
Apr 16 18:04:23.338633 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51ae1511-2083-4a0c-9d5a-a993e57fa083-tmp\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn"
Apr 16 18:04:23.338733 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d96cb\" (UniqueName: 
\"kubernetes.io/projected/f337f655-cd92-41f3-a722-832193387a64-kube-api-access-d96cb\") pod \"volume-data-source-validator-7d955d5dd4-qzg95\" (UID: \"f337f655-cd92-41f3-a722-832193387a64\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95" Apr 16 18:04:23.338859 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.338859 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/51ae1511-2083-4a0c-9d5a-a993e57fa083-snapshots\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.339102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.338883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ae1511-2083-4a0c-9d5a-a993e57fa083-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.339805 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.339785 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv"] Apr 16 18:04:23.350157 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.350135 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d96cb\" (UniqueName: \"kubernetes.io/projected/f337f655-cd92-41f3-a722-832193387a64-kube-api-access-d96cb\") pod \"volume-data-source-validator-7d955d5dd4-qzg95\" (UID: \"f337f655-cd92-41f3-a722-832193387a64\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95" Apr 16 18:04:23.439751 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.439678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:23.439751 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.439716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7fr7\" (UniqueName: \"kubernetes.io/projected/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-kube-api-access-s7fr7\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:23.439751 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.439744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cn4c\" (UniqueName: \"kubernetes.io/projected/51ae1511-2083-4a0c-9d5a-a993e57fa083-kube-api-access-5cn4c\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.440004 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.439777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0720a490-adba-45e7-a242-c37073172c9a-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" Apr 16 18:04:23.440004 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.439817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51ae1511-2083-4a0c-9d5a-a993e57fa083-tmp\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.440004 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.439834 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:04:23.440004 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.439907 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls podName:d0e00e46-92e2-4f3c-8bfe-85ee3c943462 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:23.939886531 +0000 UTC m=+145.252566007 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls") pod "cluster-samples-operator-667775844f-l9v4r" (UID: "d0e00e46-92e2-4f3c-8bfe-85ee3c943462") : secret "samples-operator-tls" not found Apr 16 18:04:23.440004 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.439971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/51ae1511-2083-4a0c-9d5a-a993e57fa083-snapshots\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ae1511-2083-4a0c-9d5a-a993e57fa083-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0720a490-adba-45e7-a242-c37073172c9a-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqlg2\" (UniqueName: \"kubernetes.io/projected/0720a490-adba-45e7-a242-c37073172c9a-kube-api-access-sqlg2\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.440144 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:23.940120048 +0000 UTC m=+145.252799682 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : configmap references non-existent config key: service-ca.crt Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbpt2\" (UniqueName: \"kubernetes.io/projected/4db8d207-3ea1-45c0-91a6-c173825a77bb-kube-api-access-hbpt2\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ae1511-2083-4a0c-9d5a-a993e57fa083-serving-cert\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51ae1511-2083-4a0c-9d5a-a993e57fa083-tmp\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.440298 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 
18:04:23.440886 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.440328 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:04:23.440886 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-stats-auth\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.440886 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-default-certificate\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.440886 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.440366 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:23.940351778 +0000 UTC m=+145.253031252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : secret "router-metrics-certs-default" not found Apr 16 18:04:23.440886 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ae1511-2083-4a0c-9d5a-a993e57fa083-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.440886 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ae1511-2083-4a0c-9d5a-a993e57fa083-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.440886 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0720a490-adba-45e7-a242-c37073172c9a-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" Apr 16 18:04:23.441252 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.440982 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/51ae1511-2083-4a0c-9d5a-a993e57fa083-snapshots\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.441363 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.441345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51ae1511-2083-4a0c-9d5a-a993e57fa083-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.442529 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.442505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0720a490-adba-45e7-a242-c37073172c9a-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" Apr 16 18:04:23.442839 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.442824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51ae1511-2083-4a0c-9d5a-a993e57fa083-serving-cert\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.443461 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.443437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-stats-auth\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.443553 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.443438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-default-certificate\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.488909 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.488882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cn4c\" (UniqueName: \"kubernetes.io/projected/51ae1511-2083-4a0c-9d5a-a993e57fa083-kube-api-access-5cn4c\") pod \"insights-operator-5785d4fcdd-qw5wn\" (UID: \"51ae1511-2083-4a0c-9d5a-a993e57fa083\") " pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.494527 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.494496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7fr7\" (UniqueName: \"kubernetes.io/projected/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-kube-api-access-s7fr7\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:23.494651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.494608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqlg2\" (UniqueName: \"kubernetes.io/projected/0720a490-adba-45e7-a242-c37073172c9a-kube-api-access-sqlg2\") pod \"kube-storage-version-migrator-operator-756bb7d76f-gwrcv\" (UID: \"0720a490-adba-45e7-a242-c37073172c9a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" Apr 16 18:04:23.496046 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.496029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbpt2\" (UniqueName: \"kubernetes.io/projected/4db8d207-3ea1-45c0-91a6-c173825a77bb-kube-api-access-hbpt2\") pod \"router-default-58fbdb974b-w74b9\" (UID: 
\"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.500890 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.500869 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95" Apr 16 18:04:23.608137 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.608098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" Apr 16 18:04:23.619192 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.619156 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95"] Apr 16 18:04:23.622552 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.622524 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" Apr 16 18:04:23.623458 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:23.623425 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf337f655_cd92_41f3_a722_832193387a64.slice/crio-c26fd9e6ce09ef2e1f1a107064be693c5d07324d691a4d837889e30e93efb28c WatchSource:0}: Error finding container c26fd9e6ce09ef2e1f1a107064be693c5d07324d691a4d837889e30e93efb28c: Status 404 returned error can't find the container with id c26fd9e6ce09ef2e1f1a107064be693c5d07324d691a4d837889e30e93efb28c Apr 16 18:04:23.739541 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.739506 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv"] Apr 16 18:04:23.742524 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:23.742494 2576 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0720a490_adba_45e7_a242_c37073172c9a.slice/crio-9dbb99b1a8bb6b7c2fed13b95e95423cc440ada6e7a2a4ae9f9d935ae34d3ed8 WatchSource:0}: Error finding container 9dbb99b1a8bb6b7c2fed13b95e95423cc440ada6e7a2a4ae9f9d935ae34d3ed8: Status 404 returned error can't find the container with id 9dbb99b1a8bb6b7c2fed13b95e95423cc440ada6e7a2a4ae9f9d935ae34d3ed8 Apr 16 18:04:23.757831 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.757806 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qw5wn"] Apr 16 18:04:23.760792 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:23.760759 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51ae1511_2083_4a0c_9d5a_a993e57fa083.slice/crio-64e0e3555dae5a727dc76548a0782937e0dd1a46572da7cab5c47f0bcdc2989b WatchSource:0}: Error finding container 64e0e3555dae5a727dc76548a0782937e0dd1a46572da7cab5c47f0bcdc2989b: Status 404 returned error can't find the container with id 64e0e3555dae5a727dc76548a0782937e0dd1a46572da7cab5c47f0bcdc2989b Apr 16 18:04:23.944285 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.944247 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.944447 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.944301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " 
pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:23.944447 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.944358 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:24.944338037 +0000 UTC m=+146.257017508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : configmap references non-existent config key: service-ca.crt Apr 16 18:04:23.944447 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:23.944391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:23.944447 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.944409 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:04:23.944616 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.944455 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:24.944441741 +0000 UTC m=+146.257121212 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : secret "router-metrics-certs-default" not found Apr 16 18:04:23.944616 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.944474 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:04:23.944616 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:23.944499 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls podName:d0e00e46-92e2-4f3c-8bfe-85ee3c943462 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:24.944491782 +0000 UTC m=+146.257171254 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls") pod "cluster-samples-operator-667775844f-l9v4r" (UID: "d0e00e46-92e2-4f3c-8bfe-85ee3c943462") : secret "samples-operator-tls" not found Apr 16 18:04:24.595338 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:24.595293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95" event={"ID":"f337f655-cd92-41f3-a722-832193387a64","Type":"ContainerStarted","Data":"c26fd9e6ce09ef2e1f1a107064be693c5d07324d691a4d837889e30e93efb28c"} Apr 16 18:04:24.596681 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:24.596628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" event={"ID":"51ae1511-2083-4a0c-9d5a-a993e57fa083","Type":"ContainerStarted","Data":"64e0e3555dae5a727dc76548a0782937e0dd1a46572da7cab5c47f0bcdc2989b"} Apr 16 18:04:24.598193 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:04:24.598155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" event={"ID":"0720a490-adba-45e7-a242-c37073172c9a","Type":"ContainerStarted","Data":"9dbb99b1a8bb6b7c2fed13b95e95423cc440ada6e7a2a4ae9f9d935ae34d3ed8"} Apr 16 18:04:24.954626 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:24.954539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:24.954626 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:24.954606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:24.954846 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:24.954672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:24.954846 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:24.954765 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:04:24.954846 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:24.954803 2576 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:04:24.954846 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:24.954778 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:26.95475413 +0000 UTC m=+148.267433605 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : configmap references non-existent config key: service-ca.crt Apr 16 18:04:24.954846 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:24.954845 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:26.954827766 +0000 UTC m=+148.267507238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : secret "router-metrics-certs-default" not found Apr 16 18:04:24.955059 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:24.954864 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls podName:d0e00e46-92e2-4f3c-8bfe-85ee3c943462 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:26.954857775 +0000 UTC m=+148.267537246 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls") pod "cluster-samples-operator-667775844f-l9v4r" (UID: "d0e00e46-92e2-4f3c-8bfe-85ee3c943462") : secret "samples-operator-tls" not found Apr 16 18:04:25.602065 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:25.602025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95" event={"ID":"f337f655-cd92-41f3-a722-832193387a64","Type":"ContainerStarted","Data":"cf8a718292e2c98be113211d98f72cb8e40e3009a1d301a97f3d9b0549666607"} Apr 16 18:04:25.630217 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:25.630160 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-qzg95" podStartSLOduration=1.279324378 podStartE2EDuration="2.630139959s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="2026-04-16 18:04:23.626163469 +0000 UTC m=+144.938842940" lastFinishedPulling="2026-04-16 18:04:24.976979037 +0000 UTC m=+146.289658521" observedRunningTime="2026-04-16 18:04:25.628769999 +0000 UTC m=+146.941449491" watchObservedRunningTime="2026-04-16 18:04:25.630139959 +0000 UTC m=+146.942819551" Apr 16 18:04:26.606204 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:26.606161 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" event={"ID":"51ae1511-2083-4a0c-9d5a-a993e57fa083","Type":"ContainerStarted","Data":"cf6e074f319ac9eda53fc050061514264dda3562b4e613d9be6f263cd5d4477c"} Apr 16 18:04:26.607481 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:26.607453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" 
event={"ID":"0720a490-adba-45e7-a242-c37073172c9a","Type":"ContainerStarted","Data":"1f34698826db846f0393a9c27c2918caf15e4ea2f1316c489374dc5dae7350ce"} Apr 16 18:04:26.628240 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:26.628190 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" podStartSLOduration=1.279364006 podStartE2EDuration="3.62817748s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="2026-04-16 18:04:23.762430315 +0000 UTC m=+145.075109786" lastFinishedPulling="2026-04-16 18:04:26.111243776 +0000 UTC m=+147.423923260" observedRunningTime="2026-04-16 18:04:26.62654615 +0000 UTC m=+147.939225644" watchObservedRunningTime="2026-04-16 18:04:26.62817748 +0000 UTC m=+147.940856967" Apr 16 18:04:26.648073 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:26.648022 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" podStartSLOduration=1.283806696 podStartE2EDuration="3.648004735s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="2026-04-16 18:04:23.744466505 +0000 UTC m=+145.057145976" lastFinishedPulling="2026-04-16 18:04:26.10866453 +0000 UTC m=+147.421344015" observedRunningTime="2026-04-16 18:04:26.647793465 +0000 UTC m=+147.960472959" watchObservedRunningTime="2026-04-16 18:04:26.648004735 +0000 UTC m=+147.960684250" Apr 16 18:04:26.972897 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:26.972805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:26.972897 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:04:26.972869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:26.973094 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:26.972916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:26.973094 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:26.972981 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:04:26.973094 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:26.973034 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:04:26.973094 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:26.973054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:30.973035307 +0000 UTC m=+152.285714780 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : configmap references non-existent config key: service-ca.crt Apr 16 18:04:26.973094 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:26.973070 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:30.973063542 +0000 UTC m=+152.285743013 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : secret "router-metrics-certs-default" not found Apr 16 18:04:26.973094 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:26.973091 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls podName:d0e00e46-92e2-4f3c-8bfe-85ee3c943462 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:30.973073618 +0000 UTC m=+152.285753103 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls") pod "cluster-samples-operator-667775844f-l9v4r" (UID: "d0e00e46-92e2-4f3c-8bfe-85ee3c943462") : secret "samples-operator-tls" not found Apr 16 18:04:27.921322 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:27.921286 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q"] Apr 16 18:04:27.924503 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:27.924474 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" Apr 16 18:04:27.929552 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:27.929528 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 18:04:27.929682 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:27.929530 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 18:04:27.929682 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:27.929530 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-qjgkx\"" Apr 16 18:04:27.938823 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:27.938796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q"] Apr 16 18:04:27.981747 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:27.981704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-756n9\" (UniqueName: \"kubernetes.io/projected/7f544ce1-56ea-420f-b0f3-067278e84ad0-kube-api-access-756n9\") pod 
\"migrator-64d4d94569-p5x8q\" (UID: \"7f544ce1-56ea-420f-b0f3-067278e84ad0\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" Apr 16 18:04:28.082057 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:28.082017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-756n9\" (UniqueName: \"kubernetes.io/projected/7f544ce1-56ea-420f-b0f3-067278e84ad0-kube-api-access-756n9\") pod \"migrator-64d4d94569-p5x8q\" (UID: \"7f544ce1-56ea-420f-b0f3-067278e84ad0\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" Apr 16 18:04:28.094770 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:28.094740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-756n9\" (UniqueName: \"kubernetes.io/projected/7f544ce1-56ea-420f-b0f3-067278e84ad0-kube-api-access-756n9\") pod \"migrator-64d4d94569-p5x8q\" (UID: \"7f544ce1-56ea-420f-b0f3-067278e84ad0\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" Apr 16 18:04:28.234339 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:28.234211 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" Apr 16 18:04:28.362038 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:28.362013 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q"] Apr 16 18:04:28.364311 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:28.364272 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f544ce1_56ea_420f_b0f3_067278e84ad0.slice/crio-7ec512024e577fb1c28d7c49407209b8717c981489d11ee2b3ad5840644dca99 WatchSource:0}: Error finding container 7ec512024e577fb1c28d7c49407209b8717c981489d11ee2b3ad5840644dca99: Status 404 returned error can't find the container with id 7ec512024e577fb1c28d7c49407209b8717c981489d11ee2b3ad5840644dca99 Apr 16 18:04:28.617768 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:28.617730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" event={"ID":"7f544ce1-56ea-420f-b0f3-067278e84ad0","Type":"ContainerStarted","Data":"7ec512024e577fb1c28d7c49407209b8717c981489d11ee2b3ad5840644dca99"} Apr 16 18:04:29.081251 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:29.081202 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q59mj_798435e6-adbf-486f-bd1a-ba36ade6c8d3/dns-node-resolver/0.log" Apr 16 18:04:29.621501 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:29.621404 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" event={"ID":"7f544ce1-56ea-420f-b0f3-067278e84ad0","Type":"ContainerStarted","Data":"7f0b1cd4707451e12620f8b25fc851f5a60a90e45e0b02d85eb53e64c3e4cc48"} Apr 16 18:04:29.621501 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:29.621445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" event={"ID":"7f544ce1-56ea-420f-b0f3-067278e84ad0","Type":"ContainerStarted","Data":"5429b30f01de1f2ecacbc86a428dfc8238d98d428f74f02130b6560f4bf8ad9f"} Apr 16 18:04:29.638894 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:29.638848 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-p5x8q" podStartSLOduration=1.722201037 podStartE2EDuration="2.638832621s" podCreationTimestamp="2026-04-16 18:04:27 +0000 UTC" firstStartedPulling="2026-04-16 18:04:28.366297547 +0000 UTC m=+149.678977018" lastFinishedPulling="2026-04-16 18:04:29.282929131 +0000 UTC m=+150.595608602" observedRunningTime="2026-04-16 18:04:29.637372971 +0000 UTC m=+150.950052465" watchObservedRunningTime="2026-04-16 18:04:29.638832621 +0000 UTC m=+150.951512108" Apr 16 18:04:30.082084 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:30.082061 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vzsgf_446d8c35-b0da-42e5-a071-ea17b9747bb2/node-ca/0.log" Apr 16 18:04:31.004983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:31.004945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:31.005132 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:31.005042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " 
pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:31.005132 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:31.005086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:31.005132 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:31.005103 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:04:31.005328 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:31.005172 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls podName:d0e00e46-92e2-4f3c-8bfe-85ee3c943462 nodeName:}" failed. No retries permitted until 2026-04-16 18:04:39.005151401 +0000 UTC m=+160.317830889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls") pod "cluster-samples-operator-667775844f-l9v4r" (UID: "d0e00e46-92e2-4f3c-8bfe-85ee3c943462") : secret "samples-operator-tls" not found Apr 16 18:04:31.005328 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:31.005193 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:04:31.005328 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:31.005222 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:39.005213604 +0000 UTC m=+160.317893075 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : secret "router-metrics-certs-default" not found Apr 16 18:04:31.005328 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:31.005252 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle podName:4db8d207-3ea1-45c0-91a6-c173825a77bb nodeName:}" failed. No retries permitted until 2026-04-16 18:04:39.0052462 +0000 UTC m=+160.317925670 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle") pod "router-default-58fbdb974b-w74b9" (UID: "4db8d207-3ea1-45c0-91a6-c173825a77bb") : configmap references non-existent config key: service-ca.crt Apr 16 18:04:35.570278 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:35.570211 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" podUID="7d9b3b2d-2a26-4003-848d-306ce8d13daa" Apr 16 18:04:35.584441 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:35.584405 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4zjt2" podUID="cec75d81-47b0-42a8-b1a3-27ed663fc255" Apr 16 18:04:35.636747 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:35.636722 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:04:35.636891 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:35.636756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:04:35.704042 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:35.703996 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-j6hg6" podUID="7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf" Apr 16 18:04:36.245277 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:04:36.245137 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-892g8" podUID="f342f33f-7ce1-4c45-a212-83b4c6fe1952" Apr 16 18:04:36.638787 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:36.638757 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j6hg6" Apr 16 18:04:39.077986 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.077947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:39.078373 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.077993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:39.078373 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.078037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:39.079972 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.079951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4db8d207-3ea1-45c0-91a6-c173825a77bb-service-ca-bundle\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:39.080554 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.080522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4db8d207-3ea1-45c0-91a6-c173825a77bb-metrics-certs\") pod \"router-default-58fbdb974b-w74b9\" (UID: \"4db8d207-3ea1-45c0-91a6-c173825a77bb\") " pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:39.080648 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.080570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d0e00e46-92e2-4f3c-8bfe-85ee3c943462-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-l9v4r\" (UID: \"d0e00e46-92e2-4f3c-8bfe-85ee3c943462\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:39.215152 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.215109 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" Apr 16 18:04:39.226998 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.226965 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:39.345531 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.345458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r"] Apr 16 18:04:39.362832 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.362585 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58fbdb974b-w74b9"] Apr 16 18:04:39.365834 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:39.365808 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4db8d207_3ea1_45c0_91a6_c173825a77bb.slice/crio-4f82d39334906b6509c834b1e34e35f70b72b09bcc9c2b487896d73eddc62e81 WatchSource:0}: Error finding container 4f82d39334906b6509c834b1e34e35f70b72b09bcc9c2b487896d73eddc62e81: Status 404 returned error can't find the container with id 4f82d39334906b6509c834b1e34e35f70b72b09bcc9c2b487896d73eddc62e81 Apr 16 18:04:39.646486 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.646391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" event={"ID":"d0e00e46-92e2-4f3c-8bfe-85ee3c943462","Type":"ContainerStarted","Data":"c71b84a39f9eece8894250c230de779f23c93ef38387f67e06a8b15c5a5ee9e2"} Apr 16 18:04:39.647604 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.647576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58fbdb974b-w74b9" event={"ID":"4db8d207-3ea1-45c0-91a6-c173825a77bb","Type":"ContainerStarted","Data":"769ec30c19ae5c828fbd873673eb3f54985401ee18e1f470d15b6ec588e9af19"} Apr 16 18:04:39.647604 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.647603 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58fbdb974b-w74b9" 
event={"ID":"4db8d207-3ea1-45c0-91a6-c173825a77bb","Type":"ContainerStarted","Data":"4f82d39334906b6509c834b1e34e35f70b72b09bcc9c2b487896d73eddc62e81"} Apr 16 18:04:39.669144 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:39.669099 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-58fbdb974b-w74b9" podStartSLOduration=16.669089072 podStartE2EDuration="16.669089072s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:39.667291654 +0000 UTC m=+160.979971146" watchObservedRunningTime="2026-04-16 18:04:39.669089072 +0000 UTC m=+160.981768564" Apr 16 18:04:40.227099 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.227060 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:40.230026 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.229999 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:40.591350 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.591312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:04:40.591548 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.591385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 
18:04:40.594054 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.594028 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cec75d81-47b0-42a8-b1a3-27ed663fc255-cert\") pod \"ingress-canary-4zjt2\" (UID: \"cec75d81-47b0-42a8-b1a3-27ed663fc255\") " pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:04:40.594421 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.594403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"image-registry-6c869d7d6-hdnf9\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:04:40.651464 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.651424 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:40.652841 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.652818 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-58fbdb974b-w74b9" Apr 16 18:04:40.692015 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.691986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:04:40.694657 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.694634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf-metrics-tls\") pod \"dns-default-j6hg6\" (UID: \"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf\") " pod="openshift-dns/dns-default-j6hg6" Apr 16 18:04:40.741140 
ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.741097 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-hdkzb\"" Apr 16 18:04:40.741140 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.741121 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-p7xs6\"" Apr 16 18:04:40.748730 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.748702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4zjt2" Apr 16 18:04:40.748851 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.748735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:04:40.843034 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.842975 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-lh6dc\"" Apr 16 18:04:40.850077 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.850050 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j6hg6"
Apr 16 18:04:40.991259 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:40.991197 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4zjt2"]
Apr 16 18:04:41.000911 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.000861 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6c869d7d6-hdnf9"]
Apr 16 18:04:41.002876 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:41.002017 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcec75d81_47b0_42a8_b1a3_27ed663fc255.slice/crio-de55a8a0a2e26e5241d765b3a63da1527b240b13bfb86d24a95f2f0cdc6796a8 WatchSource:0}: Error finding container de55a8a0a2e26e5241d765b3a63da1527b240b13bfb86d24a95f2f0cdc6796a8: Status 404 returned error can't find the container with id de55a8a0a2e26e5241d765b3a63da1527b240b13bfb86d24a95f2f0cdc6796a8
Apr 16 18:04:41.004774 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:41.004745 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d9b3b2d_2a26_4003_848d_306ce8d13daa.slice/crio-39314a1512ff159f699f034740da0353ad26e9698e612fc041022587bb1a1bca WatchSource:0}: Error finding container 39314a1512ff159f699f034740da0353ad26e9698e612fc041022587bb1a1bca: Status 404 returned error can't find the container with id 39314a1512ff159f699f034740da0353ad26e9698e612fc041022587bb1a1bca
Apr 16 18:04:41.056462 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.056437 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j6hg6"]
Apr 16 18:04:41.059389 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:41.059363 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb34a32_b49e_4b16_85ae_ffdf01b7c5bf.slice/crio-3154219884ddc215d093629517016977bcb5ff477b6bacb0054c1411fee9465c WatchSource:0}: Error finding container 3154219884ddc215d093629517016977bcb5ff477b6bacb0054c1411fee9465c: Status 404 returned error can't find the container with id 3154219884ddc215d093629517016977bcb5ff477b6bacb0054c1411fee9465c
Apr 16 18:04:41.656386 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.656346 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" event={"ID":"7d9b3b2d-2a26-4003-848d-306ce8d13daa","Type":"ContainerStarted","Data":"5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87"}
Apr 16 18:04:41.656829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.656392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" event={"ID":"7d9b3b2d-2a26-4003-848d-306ce8d13daa","Type":"ContainerStarted","Data":"39314a1512ff159f699f034740da0353ad26e9698e612fc041022587bb1a1bca"}
Apr 16 18:04:41.656829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.656413 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9"
Apr 16 18:04:41.658415 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.658377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" event={"ID":"d0e00e46-92e2-4f3c-8bfe-85ee3c943462","Type":"ContainerStarted","Data":"8a9c9bb6a99c1d5302fd8eb8dcc56e5782d828497b27dc7be747b7b58e273c0b"}
Apr 16 18:04:41.658415 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.658414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" event={"ID":"d0e00e46-92e2-4f3c-8bfe-85ee3c943462","Type":"ContainerStarted","Data":"91c8b08796f35b445ba5b26b3bfd7336350caf80e58c11a0e685c04707375e41"}
Apr 16 18:04:41.659623 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.659597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4zjt2" event={"ID":"cec75d81-47b0-42a8-b1a3-27ed663fc255","Type":"ContainerStarted","Data":"de55a8a0a2e26e5241d765b3a63da1527b240b13bfb86d24a95f2f0cdc6796a8"}
Apr 16 18:04:41.661187 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.661167 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j6hg6" event={"ID":"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf","Type":"ContainerStarted","Data":"3154219884ddc215d093629517016977bcb5ff477b6bacb0054c1411fee9465c"}
Apr 16 18:04:41.701290 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.700688 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" podStartSLOduration=162.700676497 podStartE2EDuration="2m42.700676497s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:04:41.681437784 +0000 UTC m=+162.994117301" watchObservedRunningTime="2026-04-16 18:04:41.700676497 +0000 UTC m=+163.013355989"
Apr 16 18:04:41.701290 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:41.700788 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-l9v4r" podStartSLOduration=17.204621744 podStartE2EDuration="18.700783646s" podCreationTimestamp="2026-04-16 18:04:23 +0000 UTC" firstStartedPulling="2026-04-16 18:04:39.394271595 +0000 UTC m=+160.706951066" lastFinishedPulling="2026-04-16 18:04:40.890433481 +0000 UTC m=+162.203112968" observedRunningTime="2026-04-16 18:04:41.700039959 +0000 UTC m=+163.012719452" watchObservedRunningTime="2026-04-16 18:04:41.700783646 +0000 UTC m=+163.013463138"
Apr 16 18:04:43.669032 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:43.668993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j6hg6" event={"ID":"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf","Type":"ContainerStarted","Data":"df6b61f9283d4c2c86cd0e555726f39fd59f29a8fb3234159633404ee21b4e48"}
Apr 16 18:04:43.669032 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:43.669034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j6hg6" event={"ID":"7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf","Type":"ContainerStarted","Data":"414a17543d8957f2547403196d4fd2f1c1eff1e7698478704c5b1b013d4e7304"}
Apr 16 18:04:43.669513 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:43.669084 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-j6hg6"
Apr 16 18:04:43.670361 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:43.670343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4zjt2" event={"ID":"cec75d81-47b0-42a8-b1a3-27ed663fc255","Type":"ContainerStarted","Data":"85509aa09cb7ff431b9a4454a99d24846fd7695f43734c5b086f2b3be3f48796"}
Apr 16 18:04:43.689591 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:43.689531 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j6hg6" podStartSLOduration=129.908754811 podStartE2EDuration="2m11.689521256s" podCreationTimestamp="2026-04-16 18:02:32 +0000 UTC" firstStartedPulling="2026-04-16 18:04:41.061895007 +0000 UTC m=+162.374574482" lastFinishedPulling="2026-04-16 18:04:42.842661453 +0000 UTC m=+164.155340927" observedRunningTime="2026-04-16 18:04:43.68802469 +0000 UTC m=+165.000704184" watchObservedRunningTime="2026-04-16 18:04:43.689521256 +0000 UTC m=+165.002200744"
Apr 16 18:04:43.713961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:43.713923 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4zjt2" podStartSLOduration=129.872697568 podStartE2EDuration="2m11.713911291s" podCreationTimestamp="2026-04-16 18:02:32 +0000 UTC" firstStartedPulling="2026-04-16 18:04:41.004521947 +0000 UTC m=+162.317201433" lastFinishedPulling="2026-04-16 18:04:42.845735682 +0000 UTC m=+164.158415156" observedRunningTime="2026-04-16 18:04:43.713344254 +0000 UTC m=+165.026023757" watchObservedRunningTime="2026-04-16 18:04:43.713911291 +0000 UTC m=+165.026590784"
Apr 16 18:04:50.234858 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:50.234752 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8"
Apr 16 18:04:52.269822 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.269790 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"]
Apr 16 18:04:52.274433 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.274406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"
Apr 16 18:04:52.278483 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.278451 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-2xd6z\""
Apr 16 18:04:52.278483 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.278458 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 18:04:52.284776 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.284754 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-smccb"]
Apr 16 18:04:52.287817 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.287796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"]
Apr 16 18:04:52.287927 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.287914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-smccb"
Apr 16 18:04:52.290206 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.290185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:04:52.291805 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.291783 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:04:52.291909 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.291809 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-mssqj\""
Apr 16 18:04:52.296965 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.296799 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tzzx2"]
Apr 16 18:04:52.299970 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.299953 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.302481 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.302463 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:04:52.304139 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.304123 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vblvg\""
Apr 16 18:04:52.304614 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.304595 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:04:52.304988 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.304967 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-smccb"]
Apr 16 18:04:52.322939 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.322913 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tzzx2"]
Apr 16 18:04:52.381885 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.381857 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6f516ce2-9a74-4204-b836-c2af2e82507e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-fxd9n\" (UID: \"6f516ce2-9a74-4204-b836-c2af2e82507e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"
Apr 16 18:04:52.382019 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.381892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7074862a-ed98-49ff-8632-fc350bb47fe1-crio-socket\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.382019 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.381911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbtzs\" (UniqueName: \"kubernetes.io/projected/939eb83f-31cd-4a31-9e2b-68288a7cbf8d-kube-api-access-kbtzs\") pod \"downloads-586b57c7b4-smccb\" (UID: \"939eb83f-31cd-4a31-9e2b-68288a7cbf8d\") " pod="openshift-console/downloads-586b57c7b4-smccb"
Apr 16 18:04:52.382019 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.382006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fxsr\" (UniqueName: \"kubernetes.io/projected/7074862a-ed98-49ff-8632-fc350bb47fe1-kube-api-access-8fxsr\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.382132 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.382037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7074862a-ed98-49ff-8632-fc350bb47fe1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.382132 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.382065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7074862a-ed98-49ff-8632-fc350bb47fe1-data-volume\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.382132 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.382084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7074862a-ed98-49ff-8632-fc350bb47fe1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.483075 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.482994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fxsr\" (UniqueName: \"kubernetes.io/projected/7074862a-ed98-49ff-8632-fc350bb47fe1-kube-api-access-8fxsr\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.483075 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7074862a-ed98-49ff-8632-fc350bb47fe1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.483303 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7074862a-ed98-49ff-8632-fc350bb47fe1-data-volume\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.483303 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7074862a-ed98-49ff-8632-fc350bb47fe1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.483303 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6f516ce2-9a74-4204-b836-c2af2e82507e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-fxd9n\" (UID: \"6f516ce2-9a74-4204-b836-c2af2e82507e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"
Apr 16 18:04:52.483303 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7074862a-ed98-49ff-8632-fc350bb47fe1-crio-socket\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.483303 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483206 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbtzs\" (UniqueName: \"kubernetes.io/projected/939eb83f-31cd-4a31-9e2b-68288a7cbf8d-kube-api-access-kbtzs\") pod \"downloads-586b57c7b4-smccb\" (UID: \"939eb83f-31cd-4a31-9e2b-68288a7cbf8d\") " pod="openshift-console/downloads-586b57c7b4-smccb"
Apr 16 18:04:52.483511 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7074862a-ed98-49ff-8632-fc350bb47fe1-crio-socket\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.483511 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7074862a-ed98-49ff-8632-fc350bb47fe1-data-volume\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.483770 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.483751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7074862a-ed98-49ff-8632-fc350bb47fe1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.485701 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.485675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7074862a-ed98-49ff-8632-fc350bb47fe1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.485838 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.485823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6f516ce2-9a74-4204-b836-c2af2e82507e-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-fxd9n\" (UID: \"6f516ce2-9a74-4204-b836-c2af2e82507e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"
Apr 16 18:04:52.498177 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.498155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbtzs\" (UniqueName: \"kubernetes.io/projected/939eb83f-31cd-4a31-9e2b-68288a7cbf8d-kube-api-access-kbtzs\") pod \"downloads-586b57c7b4-smccb\" (UID: \"939eb83f-31cd-4a31-9e2b-68288a7cbf8d\") " pod="openshift-console/downloads-586b57c7b4-smccb"
Apr 16 18:04:52.498329 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.498308 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fxsr\" (UniqueName: \"kubernetes.io/projected/7074862a-ed98-49ff-8632-fc350bb47fe1-kube-api-access-8fxsr\") pod \"insights-runtime-extractor-tzzx2\" (UID: \"7074862a-ed98-49ff-8632-fc350bb47fe1\") " pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.583796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.583767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"
Apr 16 18:04:52.596692 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.596669 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-smccb"
Apr 16 18:04:52.610290 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.610249 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tzzx2"
Apr 16 18:04:52.728731 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.728611 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"]
Apr 16 18:04:52.730813 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:52.730783 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f516ce2_9a74_4204_b836_c2af2e82507e.slice/crio-79774a2b769c0eea0614f05e4b105ac19431cd2ce33851e310030c2df2a64040 WatchSource:0}: Error finding container 79774a2b769c0eea0614f05e4b105ac19431cd2ce33851e310030c2df2a64040: Status 404 returned error can't find the container with id 79774a2b769c0eea0614f05e4b105ac19431cd2ce33851e310030c2df2a64040
Apr 16 18:04:52.753460 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.753433 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-smccb"]
Apr 16 18:04:52.756366 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:52.756336 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod939eb83f_31cd_4a31_9e2b_68288a7cbf8d.slice/crio-494dcf1769eb52fb86be16df7a1dc533456e307b86f72bb50dc4cf89e5b5a35a WatchSource:0}: Error finding container 494dcf1769eb52fb86be16df7a1dc533456e307b86f72bb50dc4cf89e5b5a35a: Status 404 returned error can't find the container with id 494dcf1769eb52fb86be16df7a1dc533456e307b86f72bb50dc4cf89e5b5a35a
Apr 16 18:04:52.771256 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.771213 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tzzx2"]
Apr 16 18:04:52.774072 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:52.774048 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7074862a_ed98_49ff_8632_fc350bb47fe1.slice/crio-4739e27dcf85024db56d8489adfbf6d17f49fed5eb55950967a18fab9443718d WatchSource:0}: Error finding container 4739e27dcf85024db56d8489adfbf6d17f49fed5eb55950967a18fab9443718d: Status 404 returned error can't find the container with id 4739e27dcf85024db56d8489adfbf6d17f49fed5eb55950967a18fab9443718d
Apr 16 18:04:52.983442 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.983362 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74668b85bc-2jhmb"]
Apr 16 18:04:52.986569 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.986551 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:52.989330 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.989307 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:04:52.989486 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.989413 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:04:52.989486 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.989440 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:04:52.989486 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.989454 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:04:52.989703 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.989689 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-87wwb\""
Apr 16 18:04:52.989735 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.989707 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:04:52.999545 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:52.999519 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74668b85bc-2jhmb"]
Apr 16 18:04:53.087961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.087929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-oauth-serving-cert\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.088102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.087964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-config\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.088102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.087996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-serving-cert\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.088102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.088041 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-service-ca\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.088102 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.088062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-oauth-config\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.088260 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.088179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tqmw\" (UniqueName: \"kubernetes.io/projected/fc1389cd-2c26-4408-9d07-aa77c03b4fed-kube-api-access-5tqmw\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.189576 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.189536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-oauth-config\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.189742 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.189608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tqmw\" (UniqueName: \"kubernetes.io/projected/fc1389cd-2c26-4408-9d07-aa77c03b4fed-kube-api-access-5tqmw\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.190011 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.189826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-oauth-serving-cert\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.190011 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.189872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-config\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.190011 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.189907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-serving-cert\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.190011 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.189933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-service-ca\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.191355 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.191328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-oauth-serving-cert\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.191755 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.191726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-service-ca\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.191869 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.191729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-config\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.192552 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.192517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-oauth-config\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.192672 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.192654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-serving-cert\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.200340 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.200320 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tqmw\" (UniqueName: \"kubernetes.io/projected/fc1389cd-2c26-4408-9d07-aa77c03b4fed-kube-api-access-5tqmw\") pod \"console-74668b85bc-2jhmb\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.295284 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.295064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:04:53.448919 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.448863 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74668b85bc-2jhmb"]
Apr 16 18:04:53.494980 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:53.494941 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc1389cd_2c26_4408_9d07_aa77c03b4fed.slice/crio-d2b1473c0875bb6ac71fa309a1b55e7262d91544616b2a659f8dbf0ccd8ef232 WatchSource:0}: Error finding container d2b1473c0875bb6ac71fa309a1b55e7262d91544616b2a659f8dbf0ccd8ef232: Status 404 returned error can't find the container with id d2b1473c0875bb6ac71fa309a1b55e7262d91544616b2a659f8dbf0ccd8ef232
Apr 16 18:04:53.675415 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.675369 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j6hg6"
Apr 16 18:04:53.700381 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.700276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-smccb" event={"ID":"939eb83f-31cd-4a31-9e2b-68288a7cbf8d","Type":"ContainerStarted","Data":"494dcf1769eb52fb86be16df7a1dc533456e307b86f72bb50dc4cf89e5b5a35a"}
Apr 16 18:04:53.701800 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.701765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74668b85bc-2jhmb" event={"ID":"fc1389cd-2c26-4408-9d07-aa77c03b4fed","Type":"ContainerStarted","Data":"d2b1473c0875bb6ac71fa309a1b55e7262d91544616b2a659f8dbf0ccd8ef232"}
Apr 16 18:04:53.702894 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.702862 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n" event={"ID":"6f516ce2-9a74-4204-b836-c2af2e82507e","Type":"ContainerStarted","Data":"79774a2b769c0eea0614f05e4b105ac19431cd2ce33851e310030c2df2a64040"}
Apr 16 18:04:53.704801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.704778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tzzx2" event={"ID":"7074862a-ed98-49ff-8632-fc350bb47fe1","Type":"ContainerStarted","Data":"2f3e690a6e1d034f0524a2fb4c57d6dca47e9716b62098be57dd271b4951672c"}
Apr 16 18:04:53.704908 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.704805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tzzx2" event={"ID":"7074862a-ed98-49ff-8632-fc350bb47fe1","Type":"ContainerStarted","Data":"3756b2234e14a3e014e473ee1eb75b8839389521e63b4c284c6a81018660cc85"}
Apr 16 18:04:53.704908 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:53.704815 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tzzx2" event={"ID":"7074862a-ed98-49ff-8632-fc350bb47fe1","Type":"ContainerStarted","Data":"4739e27dcf85024db56d8489adfbf6d17f49fed5eb55950967a18fab9443718d"}
Apr 16 18:04:54.710851 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:54.710810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n" event={"ID":"6f516ce2-9a74-4204-b836-c2af2e82507e","Type":"ContainerStarted","Data":"9f9979a3a92bb5dc0db0cfceaa8010d47fa9f80909587e08da85a454510fc515"}
Apr 16 18:04:54.711505 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:54.711468 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"
Apr 16 18:04:54.717893 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:54.717853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n"
Apr 16 18:04:54.730494 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:54.730337 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-fxd9n" podStartSLOduration=1.428700116 podStartE2EDuration="2.730319977s" podCreationTimestamp="2026-04-16 18:04:52 +0000 UTC" firstStartedPulling="2026-04-16 18:04:52.732925014 +0000 UTC m=+174.045604498" lastFinishedPulling="2026-04-16 18:04:54.034544862 +0000 UTC m=+175.347224359" observedRunningTime="2026-04-16 18:04:54.727821032 +0000 UTC m=+176.040500526" watchObservedRunningTime="2026-04-16 18:04:54.730319977 +0000 UTC m=+176.042999471"
Apr 16 18:04:55.831272 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.831238 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-5vqgz"]
Apr 16 18:04:55.841998 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.841964 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:55.849633 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.849603 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 18:04:55.849791 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.849663 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:04:55.850754 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.849913 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:04:55.850754 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.850126 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-ktkfl\"" Apr 16 18:04:55.850754 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.850337 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:04:55.850754 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.850584 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 18:04:55.851521 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:55.851244 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-5vqgz"] Apr 16 18:04:56.014658 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.014616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a465148d-0ff9-4c68-b379-ed74bdf8a280-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: 
\"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.014845 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.014672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a465148d-0ff9-4c68-b379-ed74bdf8a280-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.014845 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.014764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpczt\" (UniqueName: \"kubernetes.io/projected/a465148d-0ff9-4c68-b379-ed74bdf8a280-kube-api-access-lpczt\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.014952 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.014882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a465148d-0ff9-4c68-b379-ed74bdf8a280-metrics-client-ca\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.116793 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.116450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a465148d-0ff9-4c68-b379-ed74bdf8a280-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 
18:04:56.116793 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.116517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a465148d-0ff9-4c68-b379-ed74bdf8a280-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.116793 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.116555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpczt\" (UniqueName: \"kubernetes.io/projected/a465148d-0ff9-4c68-b379-ed74bdf8a280-kube-api-access-lpczt\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.116793 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.116634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a465148d-0ff9-4c68-b379-ed74bdf8a280-metrics-client-ca\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.117817 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.117780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a465148d-0ff9-4c68-b379-ed74bdf8a280-metrics-client-ca\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.120462 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.119955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a465148d-0ff9-4c68-b379-ed74bdf8a280-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.120462 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.120424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a465148d-0ff9-4c68-b379-ed74bdf8a280-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.147431 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.147354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpczt\" (UniqueName: \"kubernetes.io/projected/a465148d-0ff9-4c68-b379-ed74bdf8a280-kube-api-access-lpczt\") pod \"prometheus-operator-78f957474d-5vqgz\" (UID: \"a465148d-0ff9-4c68-b379-ed74bdf8a280\") " pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:56.156047 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:56.155760 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" Apr 16 18:04:57.215022 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:57.214990 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-5vqgz"] Apr 16 18:04:57.216941 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:04:57.216910 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda465148d_0ff9_4c68_b379_ed74bdf8a280.slice/crio-7c38e0aca240afc157c485aa251546e466ed24e658fdce55493cd60882b3dd72 WatchSource:0}: Error finding container 7c38e0aca240afc157c485aa251546e466ed24e658fdce55493cd60882b3dd72: Status 404 returned error can't find the container with id 7c38e0aca240afc157c485aa251546e466ed24e658fdce55493cd60882b3dd72 Apr 16 18:04:57.723080 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:57.722995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" event={"ID":"a465148d-0ff9-4c68-b379-ed74bdf8a280","Type":"ContainerStarted","Data":"7c38e0aca240afc157c485aa251546e466ed24e658fdce55493cd60882b3dd72"} Apr 16 18:04:57.724909 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:57.724861 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74668b85bc-2jhmb" event={"ID":"fc1389cd-2c26-4408-9d07-aa77c03b4fed","Type":"ContainerStarted","Data":"335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21"} Apr 16 18:04:57.727786 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:57.727744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tzzx2" event={"ID":"7074862a-ed98-49ff-8632-fc350bb47fe1","Type":"ContainerStarted","Data":"3de4ea65cacc7ecbf2ad9b1c2c4255ca8c5f99291c0cd05993e1e0d6a8cbfb67"} Apr 16 18:04:57.747270 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:57.747202 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74668b85bc-2jhmb" podStartSLOduration=1.816816451 podStartE2EDuration="5.74718743s" podCreationTimestamp="2026-04-16 18:04:52 +0000 UTC" firstStartedPulling="2026-04-16 18:04:53.49783059 +0000 UTC m=+174.810510074" lastFinishedPulling="2026-04-16 18:04:57.428201572 +0000 UTC m=+178.740881053" observedRunningTime="2026-04-16 18:04:57.746749156 +0000 UTC m=+179.059428660" watchObservedRunningTime="2026-04-16 18:04:57.74718743 +0000 UTC m=+179.059866920" Apr 16 18:04:57.771932 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:57.770079 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tzzx2" podStartSLOduration=1.540482158 podStartE2EDuration="5.770060588s" podCreationTimestamp="2026-04-16 18:04:52 +0000 UTC" firstStartedPulling="2026-04-16 18:04:52.844105874 +0000 UTC m=+174.156785346" lastFinishedPulling="2026-04-16 18:04:57.073684303 +0000 UTC m=+178.386363776" observedRunningTime="2026-04-16 18:04:57.768636061 +0000 UTC m=+179.081315554" watchObservedRunningTime="2026-04-16 18:04:57.770060588 +0000 UTC m=+179.082740081" Apr 16 18:04:58.733991 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:58.733903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" event={"ID":"a465148d-0ff9-4c68-b379-ed74bdf8a280","Type":"ContainerStarted","Data":"4bdcc1a81ede2569c90282aa56d48f771fb3100b85b88f74c93b0c8c072bc8de"} Apr 16 18:04:58.734478 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:58.733945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" event={"ID":"a465148d-0ff9-4c68-b379-ed74bdf8a280","Type":"ContainerStarted","Data":"87e3c3922c10710c3123365210d7d6052a26c924147b2bf9f0fcdd906ba4fea3"} Apr 16 18:04:58.753754 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:04:58.753707 
2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-5vqgz" podStartSLOduration=2.424238924 podStartE2EDuration="3.753692552s" podCreationTimestamp="2026-04-16 18:04:55 +0000 UTC" firstStartedPulling="2026-04-16 18:04:57.219533108 +0000 UTC m=+178.532212588" lastFinishedPulling="2026-04-16 18:04:58.54898673 +0000 UTC m=+179.861666216" observedRunningTime="2026-04-16 18:04:58.751958005 +0000 UTC m=+180.064637535" watchObservedRunningTime="2026-04-16 18:04:58.753692552 +0000 UTC m=+180.066372044" Apr 16 18:05:00.269049 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.269007 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xp882"] Apr 16 18:05:00.274139 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.274112 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.277767 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.277724 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:05:00.277767 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.277750 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s82h7\"" Apr 16 18:05:00.277966 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.277891 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:05:00.277966 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.277928 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:05:00.457413 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457375 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad872cd2-8712-402a-9d8d-4742f86d316b-metrics-client-ca\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.457607 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-accelerators-collector-config\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.457607 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-tls\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.457607 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.457774 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-sys\") pod \"node-exporter-xp882\" (UID: 
\"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.457774 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457677 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-textfile\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.457774 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-wtmp\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.457893 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9gwg\" (UniqueName: \"kubernetes.io/projected/ad872cd2-8712-402a-9d8d-4742f86d316b-kube-api-access-p9gwg\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.457893 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.457795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-root\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559137 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559336 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-sys\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559336 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-textfile\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559336 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-wtmp\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559336 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-sys\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559336 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p9gwg\" (UniqueName: \"kubernetes.io/projected/ad872cd2-8712-402a-9d8d-4742f86d316b-kube-api-access-p9gwg\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559638 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-wtmp\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559716 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-textfile\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559871 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-root\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559934 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad872cd2-8712-402a-9d8d-4742f86d316b-metrics-client-ca\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.559986 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559955 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"root\" (UniqueName: \"kubernetes.io/host-path/ad872cd2-8712-402a-9d8d-4742f86d316b-root\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.560436 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.559964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-accelerators-collector-config\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.560570 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.560466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-tls\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.561176 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.561151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad872cd2-8712-402a-9d8d-4742f86d316b-metrics-client-ca\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.565924 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.562029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.565924 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:05:00.563119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-accelerators-collector-config\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.565924 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.563307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ad872cd2-8712-402a-9d8d-4742f86d316b-node-exporter-tls\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.569368 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.569341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9gwg\" (UniqueName: \"kubernetes.io/projected/ad872cd2-8712-402a-9d8d-4742f86d316b-kube-api-access-p9gwg\") pod \"node-exporter-xp882\" (UID: \"ad872cd2-8712-402a-9d8d-4742f86d316b\") " pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.585414 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.585382 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xp882" Apr 16 18:05:00.595699 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:05:00.595665 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad872cd2_8712_402a_9d8d_4742f86d316b.slice/crio-ee0450a04f434d8dd60fb05050b19609fbf957d1d60094113f900e5b490d1fa1 WatchSource:0}: Error finding container ee0450a04f434d8dd60fb05050b19609fbf957d1d60094113f900e5b490d1fa1: Status 404 returned error can't find the container with id ee0450a04f434d8dd60fb05050b19609fbf957d1d60094113f900e5b490d1fa1 Apr 16 18:05:00.740566 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.740521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xp882" event={"ID":"ad872cd2-8712-402a-9d8d-4742f86d316b","Type":"ContainerStarted","Data":"ee0450a04f434d8dd60fb05050b19609fbf957d1d60094113f900e5b490d1fa1"} Apr 16 18:05:00.753619 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.753564 2576 patch_prober.go:28] interesting pod/image-registry-6c869d7d6-hdnf9 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:05:00.753787 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:00.753639 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" podUID="7d9b3b2d-2a26-4003-848d-306ce8d13daa" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:05:02.669289 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:02.669261 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:05:02.749216 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:05:02.749179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xp882" event={"ID":"ad872cd2-8712-402a-9d8d-4742f86d316b","Type":"ContainerStarted","Data":"1713807ca9cef8afa26165e9c6de512d355c56805c91f30c74d4d2f58616fced"}
Apr 16 18:05:03.295986 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:03.295937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:05:03.296332 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:03.296007 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:05:03.302597 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:03.302574 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:05:03.756685 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:03.756655 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74668b85bc-2jhmb"
Apr 16 18:05:04.986563 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:04.986525 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"]
Apr 16 18:05:04.992095 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:04.992071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"
Apr 16 18:05:04.996444 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:04.996423 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-8k8pd\""
Apr 16 18:05:04.997583 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:04.997555 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 18:05:05.001348 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:05.001326 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"]
Apr 16 18:05:05.005129 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:05.005064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/95273443-3fd0-40e6-8ba2-20dd9bafad0a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-92twm\" (UID: \"95273443-3fd0-40e6-8ba2-20dd9bafad0a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"
Apr 16 18:05:05.106124 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:05.106046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/95273443-3fd0-40e6-8ba2-20dd9bafad0a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-92twm\" (UID: \"95273443-3fd0-40e6-8ba2-20dd9bafad0a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"
Apr 16 18:05:05.106337 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:05:05.106216 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 18:05:05.106337 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:05:05.106322 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95273443-3fd0-40e6-8ba2-20dd9bafad0a-monitoring-plugin-cert podName:95273443-3fd0-40e6-8ba2-20dd9bafad0a nodeName:}" failed. No retries permitted until 2026-04-16 18:05:05.606300783 +0000 UTC m=+186.918980260 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/95273443-3fd0-40e6-8ba2-20dd9bafad0a-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-92twm" (UID: "95273443-3fd0-40e6-8ba2-20dd9bafad0a") : secret "monitoring-plugin-cert" not found
Apr 16 18:05:05.610768 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:05.610709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/95273443-3fd0-40e6-8ba2-20dd9bafad0a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-92twm\" (UID: \"95273443-3fd0-40e6-8ba2-20dd9bafad0a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"
Apr 16 18:05:05.613778 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:05.613746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/95273443-3fd0-40e6-8ba2-20dd9bafad0a-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-92twm\" (UID: \"95273443-3fd0-40e6-8ba2-20dd9bafad0a\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"
Apr 16 18:05:05.903171 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:05.903077 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"
Apr 16 18:05:06.496728 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.496691 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:05:06.502335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.502263 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.505414 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.505259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 18:05:06.505414 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.505259 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eefgpp07cmo32\""
Apr 16 18:05:06.505414 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.505302 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 18:05:06.506006 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.505846 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 18:05:06.506006 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.505867 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 18:05:06.506006 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.505879 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 18:05:06.506006 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.505901 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 18:05:06.506275 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.506036 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 18:05:06.506600 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.506581 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 18:05:06.506693 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.506629 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 18:05:06.506693 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.506668 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 18:05:06.506779 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.506587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sqg2h\""
Apr 16 18:05:06.513370 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.513345 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 18:05:06.514104 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.514082 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 18:05:06.515809 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.515782 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 18:05:06.520283 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520390 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-config\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520390 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520390 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520579 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520579 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520427 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520579 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520467 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520579 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520490 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520579 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520579 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520579 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-config-out\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65wf\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-kube-api-access-c65wf\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520682 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-web-config\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.520867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.520828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.521649 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.521624 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:05:06.622055 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622268 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622268 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622268 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622268 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622268 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-config-out\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622268 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622213 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c65wf\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-kube-api-access-c65wf\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-web-config\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622443 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-config\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.622586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.623075 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.623075 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.622886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.623287 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.623250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.623395 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.623337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.624398 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.624377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.626548 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.626521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-web-config\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.626796 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.626701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.628060 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.627701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.628060 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.627993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.628060 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.628015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.628780 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.628740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.629009 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.628987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.629117 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.629102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.629350 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.629325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-config\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.629906 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.629884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.629906 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.629895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.630145 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.630023 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.631368 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.631344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-config-out\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.633268 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.633218 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65wf\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-kube-api-access-c65wf\") pod \"prometheus-k8s-0\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:06.819626 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:06.819584 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:05:08.843561 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.843486 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66bb55c95f-zcm47"]
Apr 16 18:05:08.848727 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.848699 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:08.858053 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.858017 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:05:08.858849 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.858804 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66bb55c95f-zcm47"]
Apr 16 18:05:08.945708 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.945674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-serving-cert\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:08.945878 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.945766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-service-ca\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:08.945878 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.945801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-oauth-serving-cert\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:08.945878 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.945827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-trusted-ca-bundle\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:08.946049 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.945902 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-config\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:08.946049 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.945959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc5ck\" (UniqueName: \"kubernetes.io/projected/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-kube-api-access-fc5ck\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:08.946137 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:08.946059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-oauth-config\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.047057 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.047024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fc5ck\" (UniqueName: \"kubernetes.io/projected/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-kube-api-access-fc5ck\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.047273 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.047074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-oauth-config\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.047273 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.047099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-serving-cert\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.047273 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.047137 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-service-ca\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.047273 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.047156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-oauth-serving-cert\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.047273 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.047172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-trusted-ca-bundle\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.047273 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.047190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-config\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.048638 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.048608 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-oauth-serving-cert\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.048766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.048678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-service-ca\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.048766 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.048738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-config\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.048872 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.048845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-trusted-ca-bundle\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.050123 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.050099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-serving-cert\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.050123 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.050114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-oauth-config\") pod \"console-66bb55c95f-zcm47\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:05:09.059329 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.059298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc5ck\" (UniqueName: \"kubernetes.io/projected/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-kube-api-access-fc5ck\") pod \"console-66bb55c95f-zcm47\" (UID:
\"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " pod="openshift-console/console-66bb55c95f-zcm47" Apr 16 18:05:09.161322 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.161214 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66bb55c95f-zcm47" Apr 16 18:05:09.470214 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.470187 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:05:09.471741 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:05:09.471717 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14604c37_82f6_42cb_98de_9da3ccb24d89.slice/crio-382458950cbd91ce122f1920580fc4f50378e74af8fcdba6aba94d5b9b11dfc5 WatchSource:0}: Error finding container 382458950cbd91ce122f1920580fc4f50378e74af8fcdba6aba94d5b9b11dfc5: Status 404 returned error can't find the container with id 382458950cbd91ce122f1920580fc4f50378e74af8fcdba6aba94d5b9b11dfc5 Apr 16 18:05:09.652570 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.652537 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm"] Apr 16 18:05:09.655056 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:05:09.655023 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95273443_3fd0_40e6_8ba2_20dd9bafad0a.slice/crio-e17473de3727e4dafd09afdb13b81430f6a563798ada7ac5d3a92076c85e7896 WatchSource:0}: Error finding container e17473de3727e4dafd09afdb13b81430f6a563798ada7ac5d3a92076c85e7896: Status 404 returned error can't find the container with id e17473de3727e4dafd09afdb13b81430f6a563798ada7ac5d3a92076c85e7896 Apr 16 18:05:09.661197 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.661133 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66bb55c95f-zcm47"] Apr 16 
18:05:09.663522 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:05:09.663501 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bb9d3d_67b1_4ab6_ae9d_a782b377fdb1.slice/crio-9d101f223c887941046f2724d0d90eb67f2355599de162346a903b22e4047338 WatchSource:0}: Error finding container 9d101f223c887941046f2724d0d90eb67f2355599de162346a903b22e4047338: Status 404 returned error can't find the container with id 9d101f223c887941046f2724d0d90eb67f2355599de162346a903b22e4047338 Apr 16 18:05:09.781067 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.781031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-smccb" event={"ID":"939eb83f-31cd-4a31-9e2b-68288a7cbf8d","Type":"ContainerStarted","Data":"2b92412139b24d7677496ddf8f2e365c75b7051f4e244480c4110969bdf21170"} Apr 16 18:05:09.781495 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.781461 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-smccb" Apr 16 18:05:09.782840 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.782809 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm" event={"ID":"95273443-3fd0-40e6-8ba2-20dd9bafad0a","Type":"ContainerStarted","Data":"e17473de3727e4dafd09afdb13b81430f6a563798ada7ac5d3a92076c85e7896"} Apr 16 18:05:09.784921 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.784901 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad872cd2-8712-402a-9d8d-4742f86d316b" containerID="1713807ca9cef8afa26165e9c6de512d355c56805c91f30c74d4d2f58616fced" exitCode=0 Apr 16 18:05:09.785185 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.785050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xp882" 
event={"ID":"ad872cd2-8712-402a-9d8d-4742f86d316b","Type":"ContainerDied","Data":"1713807ca9cef8afa26165e9c6de512d355c56805c91f30c74d4d2f58616fced"} Apr 16 18:05:09.787083 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.787046 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerStarted","Data":"382458950cbd91ce122f1920580fc4f50378e74af8fcdba6aba94d5b9b11dfc5"} Apr 16 18:05:09.790264 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.790214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66bb55c95f-zcm47" event={"ID":"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1","Type":"ContainerStarted","Data":"6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9"} Apr 16 18:05:09.791047 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.790898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66bb55c95f-zcm47" event={"ID":"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1","Type":"ContainerStarted","Data":"9d101f223c887941046f2724d0d90eb67f2355599de162346a903b22e4047338"} Apr 16 18:05:09.795662 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.795525 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-smccb" Apr 16 18:05:09.828553 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.828132 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-smccb" podStartSLOduration=1.193961831 podStartE2EDuration="17.828104577s" podCreationTimestamp="2026-04-16 18:04:52 +0000 UTC" firstStartedPulling="2026-04-16 18:04:52.758184963 +0000 UTC m=+174.070864437" lastFinishedPulling="2026-04-16 18:05:09.392327711 +0000 UTC m=+190.705007183" observedRunningTime="2026-04-16 18:05:09.805288471 +0000 UTC m=+191.117967965" watchObservedRunningTime="2026-04-16 18:05:09.828104577 +0000 UTC 
m=+191.140784071" Apr 16 18:05:09.846607 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:09.846553 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66bb55c95f-zcm47" podStartSLOduration=1.8465360849999999 podStartE2EDuration="1.846536085s" podCreationTimestamp="2026-04-16 18:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:05:09.846030701 +0000 UTC m=+191.158710194" watchObservedRunningTime="2026-04-16 18:05:09.846536085 +0000 UTC m=+191.159215556" Apr 16 18:05:10.797329 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:10.797290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xp882" event={"ID":"ad872cd2-8712-402a-9d8d-4742f86d316b","Type":"ContainerStarted","Data":"1edb5b60d16a952a1e3cb7e09ffb77769b95399831241dbe0748112da1cbf26a"} Apr 16 18:05:10.797329 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:10.797332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xp882" event={"ID":"ad872cd2-8712-402a-9d8d-4742f86d316b","Type":"ContainerStarted","Data":"6d0e8639dbe0da4c73a1b1117d2298341d9c7e975c863dfd5ade3dff7f498dff"} Apr 16 18:05:10.824262 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:10.824186 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xp882" podStartSLOduration=9.369056745 podStartE2EDuration="10.824167428s" podCreationTimestamp="2026-04-16 18:05:00 +0000 UTC" firstStartedPulling="2026-04-16 18:05:00.59766302 +0000 UTC m=+181.910342504" lastFinishedPulling="2026-04-16 18:05:02.052773712 +0000 UTC m=+183.365453187" observedRunningTime="2026-04-16 18:05:10.821862557 +0000 UTC m=+192.134542051" watchObservedRunningTime="2026-04-16 18:05:10.824167428 +0000 UTC m=+192.136846921" Apr 16 18:05:11.801862 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:05:11.801826 2576 generic.go:358] "Generic (PLEG): container finished" podID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerID="ad3e063a8d22be4aa974f74182af76551aff8d02650c41121c0e7b8a72115bae" exitCode=0 Apr 16 18:05:11.802343 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:11.801924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerDied","Data":"ad3e063a8d22be4aa974f74182af76551aff8d02650c41121c0e7b8a72115bae"} Apr 16 18:05:11.803808 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:11.803723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm" event={"ID":"95273443-3fd0-40e6-8ba2-20dd9bafad0a","Type":"ContainerStarted","Data":"40cf0c858866a1b155fe6de78a2da4617a66f91d1130d8830d732b9ea56c6f89"} Apr 16 18:05:11.804097 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:11.804037 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm" Apr 16 18:05:11.810146 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:11.810120 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm" Apr 16 18:05:11.870316 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:11.870251 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-92twm" podStartSLOduration=6.20848129 podStartE2EDuration="7.870206003s" podCreationTimestamp="2026-04-16 18:05:04 +0000 UTC" firstStartedPulling="2026-04-16 18:05:09.657146042 +0000 UTC m=+190.969825514" lastFinishedPulling="2026-04-16 18:05:11.318870755 +0000 UTC m=+192.631550227" observedRunningTime="2026-04-16 18:05:11.869826637 +0000 UTC m=+193.182506137" watchObservedRunningTime="2026-04-16 18:05:11.870206003 +0000 UTC m=+193.182885497" Apr 16 
18:05:14.840917 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:14.840821 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c869d7d6-hdnf9"] Apr 16 18:05:15.822398 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:15.822317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerStarted","Data":"dcb58e4e2f1ab824ed2909e023b32a8a86440b3c901068ae80d488bf4957624e"} Apr 16 18:05:15.822398 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:15.822365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerStarted","Data":"c5ac7e91ec683c72878e167707699ce9005241904235052177117bab40c44f00"} Apr 16 18:05:18.835881 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:18.835846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerStarted","Data":"539712fb9465ef88ffbb6d52c46c496d828fad7db140cc0186c076bd50c391e9"} Apr 16 18:05:18.836315 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:18.835887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerStarted","Data":"70dc8de984f2e2d2c4f79caad12b9c1ce5da505043a7962d6c0f3d92582dce30"} Apr 16 18:05:18.836315 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:18.835900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerStarted","Data":"4255d707c5816a732eaf6d54d29fe24006579ab128e345bd72662ec70a601445"} Apr 16 18:05:18.836315 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:18.835913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerStarted","Data":"c308b3cd6872dc158c926decb751cfbcfd8d53c911ac7af798bbf00c3fdf2844"} Apr 16 18:05:19.162132 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:19.162089 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66bb55c95f-zcm47" Apr 16 18:05:19.162132 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:19.162143 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66bb55c95f-zcm47" Apr 16 18:05:19.167695 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:19.167668 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66bb55c95f-zcm47" Apr 16 18:05:19.845035 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:19.844997 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66bb55c95f-zcm47" Apr 16 18:05:19.886448 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:19.886394 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.136573089 podStartE2EDuration="13.8863788s" podCreationTimestamp="2026-04-16 18:05:06 +0000 UTC" firstStartedPulling="2026-04-16 18:05:09.474066585 +0000 UTC m=+190.786746056" lastFinishedPulling="2026-04-16 18:05:18.223872286 +0000 UTC m=+199.536551767" observedRunningTime="2026-04-16 18:05:19.882725433 +0000 UTC m=+201.195405163" watchObservedRunningTime="2026-04-16 18:05:19.8863788 +0000 UTC m=+201.199058292" Apr 16 18:05:19.955639 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:19.955604 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74668b85bc-2jhmb"] Apr 16 18:05:21.819896 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:21.819860 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:05:39.868320 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:39.868258 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" podUID="7d9b3b2d-2a26-4003-848d-306ce8d13daa" containerName="registry" containerID="cri-o://5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87" gracePeriod=30 Apr 16 18:05:40.117362 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.117336 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:05:40.159074 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.158996 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-trusted-ca\") pod \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " Apr 16 18:05:40.159074 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159040 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") pod \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " Apr 16 18:05:40.159074 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159070 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-bound-sa-token\") pod \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " Apr 16 18:05:40.159372 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159127 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-installation-pull-secrets\") pod \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " Apr 16 18:05:40.159372 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159151 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d9b3b2d-2a26-4003-848d-306ce8d13daa-ca-trust-extracted\") pod \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " Apr 16 18:05:40.159372 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159179 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9s6h\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-kube-api-access-m9s6h\") pod \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " Apr 16 18:05:40.159372 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159260 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-image-registry-private-configuration\") pod \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " Apr 16 18:05:40.159372 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159286 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-certificates\") pod \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\" (UID: \"7d9b3b2d-2a26-4003-848d-306ce8d13daa\") " Apr 16 18:05:40.159594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159416 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-trusted-ca" 
(OuterVolumeSpecName: "trusted-ca") pod "7d9b3b2d-2a26-4003-848d-306ce8d13daa" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:40.159594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159577 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-trusted-ca\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:40.159831 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.159800 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7d9b3b2d-2a26-4003-848d-306ce8d13daa" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:40.162112 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.162059 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "7d9b3b2d-2a26-4003-848d-306ce8d13daa" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:40.162112 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.162094 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7d9b3b2d-2a26-4003-848d-306ce8d13daa" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:40.162307 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.162172 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7d9b3b2d-2a26-4003-848d-306ce8d13daa" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:40.162369 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.162296 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-kube-api-access-m9s6h" (OuterVolumeSpecName: "kube-api-access-m9s6h") pod "7d9b3b2d-2a26-4003-848d-306ce8d13daa" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa"). InnerVolumeSpecName "kube-api-access-m9s6h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:40.162651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.162620 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7d9b3b2d-2a26-4003-848d-306ce8d13daa" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:40.168716 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.168693 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d9b3b2d-2a26-4003-848d-306ce8d13daa-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7d9b3b2d-2a26-4003-848d-306ce8d13daa" (UID: "7d9b3b2d-2a26-4003-848d-306ce8d13daa"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:05:40.260699 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.260660 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-bound-sa-token\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:40.260699 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.260693 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-installation-pull-secrets\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:40.260699 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.260704 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7d9b3b2d-2a26-4003-848d-306ce8d13daa-ca-trust-extracted\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:40.260915 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.260715 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m9s6h\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-kube-api-access-m9s6h\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:40.260915 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.260724 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7d9b3b2d-2a26-4003-848d-306ce8d13daa-image-registry-private-configuration\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:40.260915 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.260733 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-certificates\") on node 
\"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:40.260915 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.260742 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7d9b3b2d-2a26-4003-848d-306ce8d13daa-registry-tls\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:40.900819 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.900785 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d9b3b2d-2a26-4003-848d-306ce8d13daa" containerID="5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87" exitCode=0 Apr 16 18:05:40.901267 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.900849 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" Apr 16 18:05:40.901267 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.900849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" event={"ID":"7d9b3b2d-2a26-4003-848d-306ce8d13daa","Type":"ContainerDied","Data":"5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87"} Apr 16 18:05:40.901267 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.900950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6c869d7d6-hdnf9" event={"ID":"7d9b3b2d-2a26-4003-848d-306ce8d13daa","Type":"ContainerDied","Data":"39314a1512ff159f699f034740da0353ad26e9698e612fc041022587bb1a1bca"} Apr 16 18:05:40.901267 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.900965 2576 scope.go:117] "RemoveContainer" containerID="5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87" Apr 16 18:05:40.910562 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.910542 2576 scope.go:117] "RemoveContainer" containerID="5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87" Apr 16 18:05:40.910819 ip-10-0-137-213 
kubenswrapper[2576]: E0416 18:05:40.910801 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87\": container with ID starting with 5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87 not found: ID does not exist" containerID="5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87" Apr 16 18:05:40.910865 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.910828 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87"} err="failed to get container status \"5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87\": rpc error: code = NotFound desc = could not find container \"5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87\": container with ID starting with 5682c7d7f51d9d63d109f0ddb183e7667f84f2e264e22ef4b338f7fe88772f87 not found: ID does not exist" Apr 16 18:05:40.926707 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.926681 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6c869d7d6-hdnf9"] Apr 16 18:05:40.936600 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:40.936575 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6c869d7d6-hdnf9"] Apr 16 18:05:41.238671 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:41.238581 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d9b3b2d-2a26-4003-848d-306ce8d13daa" path="/var/lib/kubelet/pods/7d9b3b2d-2a26-4003-848d-306ce8d13daa/volumes" Apr 16 18:05:44.982542 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:44.982484 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74668b85bc-2jhmb" podUID="fc1389cd-2c26-4408-9d07-aa77c03b4fed" 
containerName="console" containerID="cri-o://335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21" gracePeriod=15 Apr 16 18:05:45.237073 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.237024 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74668b85bc-2jhmb_fc1389cd-2c26-4408-9d07-aa77c03b4fed/console/0.log" Apr 16 18:05:45.237188 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.237079 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74668b85bc-2jhmb" Apr 16 18:05:45.302190 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302145 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-service-ca\") pod \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " Apr 16 18:05:45.302418 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302282 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-oauth-serving-cert\") pod \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " Apr 16 18:05:45.302418 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302315 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-serving-cert\") pod \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " Apr 16 18:05:45.302418 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302338 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-config\") pod 
\"fc1389cd-2c26-4408-9d07-aa77c03b4fed\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " Apr 16 18:05:45.302418 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302362 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-oauth-config\") pod \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " Apr 16 18:05:45.302418 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302384 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tqmw\" (UniqueName: \"kubernetes.io/projected/fc1389cd-2c26-4408-9d07-aa77c03b4fed-kube-api-access-5tqmw\") pod \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\" (UID: \"fc1389cd-2c26-4408-9d07-aa77c03b4fed\") " Apr 16 18:05:45.302699 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302660 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-service-ca" (OuterVolumeSpecName: "service-ca") pod "fc1389cd-2c26-4408-9d07-aa77c03b4fed" (UID: "fc1389cd-2c26-4408-9d07-aa77c03b4fed"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:45.302816 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302785 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-config" (OuterVolumeSpecName: "console-config") pod "fc1389cd-2c26-4408-9d07-aa77c03b4fed" (UID: "fc1389cd-2c26-4408-9d07-aa77c03b4fed"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:45.302816 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.302786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fc1389cd-2c26-4408-9d07-aa77c03b4fed" (UID: "fc1389cd-2c26-4408-9d07-aa77c03b4fed"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:05:45.304754 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.304722 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fc1389cd-2c26-4408-9d07-aa77c03b4fed" (UID: "fc1389cd-2c26-4408-9d07-aa77c03b4fed"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:45.304754 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.304738 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc1389cd-2c26-4408-9d07-aa77c03b4fed-kube-api-access-5tqmw" (OuterVolumeSpecName: "kube-api-access-5tqmw") pod "fc1389cd-2c26-4408-9d07-aa77c03b4fed" (UID: "fc1389cd-2c26-4408-9d07-aa77c03b4fed"). InnerVolumeSpecName "kube-api-access-5tqmw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:05:45.304876 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.304772 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fc1389cd-2c26-4408-9d07-aa77c03b4fed" (UID: "fc1389cd-2c26-4408-9d07-aa77c03b4fed"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:05:45.403947 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.403907 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-service-ca\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:45.403947 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.403942 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-oauth-serving-cert\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:45.403947 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.403952 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-serving-cert\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:45.404183 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.403961 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-config\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:45.404183 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.403971 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fc1389cd-2c26-4408-9d07-aa77c03b4fed-console-oauth-config\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:45.404183 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.403979 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tqmw\" (UniqueName: \"kubernetes.io/projected/fc1389cd-2c26-4408-9d07-aa77c03b4fed-kube-api-access-5tqmw\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:05:45.919309 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:05:45.919279 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74668b85bc-2jhmb_fc1389cd-2c26-4408-9d07-aa77c03b4fed/console/0.log" Apr 16 18:05:45.919481 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.919320 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc1389cd-2c26-4408-9d07-aa77c03b4fed" containerID="335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21" exitCode=2 Apr 16 18:05:45.919481 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.919375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74668b85bc-2jhmb" event={"ID":"fc1389cd-2c26-4408-9d07-aa77c03b4fed","Type":"ContainerDied","Data":"335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21"} Apr 16 18:05:45.919481 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.919419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74668b85bc-2jhmb" event={"ID":"fc1389cd-2c26-4408-9d07-aa77c03b4fed","Type":"ContainerDied","Data":"d2b1473c0875bb6ac71fa309a1b55e7262d91544616b2a659f8dbf0ccd8ef232"} Apr 16 18:05:45.919481 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.919419 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74668b85bc-2jhmb" Apr 16 18:05:45.919654 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.919483 2576 scope.go:117] "RemoveContainer" containerID="335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21" Apr 16 18:05:45.927994 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.927971 2576 scope.go:117] "RemoveContainer" containerID="335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21" Apr 16 18:05:45.928305 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:05:45.928284 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21\": container with ID starting with 335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21 not found: ID does not exist" containerID="335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21" Apr 16 18:05:45.928390 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.928314 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21"} err="failed to get container status \"335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21\": rpc error: code = NotFound desc = could not find container \"335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21\": container with ID starting with 335a9f14f8adc7014de70831b84c6ca88f22a029d37c4a630ceaef74abf9aa21 not found: ID does not exist" Apr 16 18:05:45.941826 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.941787 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74668b85bc-2jhmb"] Apr 16 18:05:45.946406 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:45.946378 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74668b85bc-2jhmb"] Apr 16 18:05:47.240526 ip-10-0-137-213 kubenswrapper[2576]: 
I0416 18:05:47.240490 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc1389cd-2c26-4408-9d07-aa77c03b4fed" path="/var/lib/kubelet/pods/fc1389cd-2c26-4408-9d07-aa77c03b4fed/volumes" Apr 16 18:05:51.939957 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:51.939850 2576 generic.go:358] "Generic (PLEG): container finished" podID="51ae1511-2083-4a0c-9d5a-a993e57fa083" containerID="cf6e074f319ac9eda53fc050061514264dda3562b4e613d9be6f263cd5d4477c" exitCode=0 Apr 16 18:05:51.939957 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:51.939923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" event={"ID":"51ae1511-2083-4a0c-9d5a-a993e57fa083","Type":"ContainerDied","Data":"cf6e074f319ac9eda53fc050061514264dda3562b4e613d9be6f263cd5d4477c"} Apr 16 18:05:51.940373 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:51.940335 2576 scope.go:117] "RemoveContainer" containerID="cf6e074f319ac9eda53fc050061514264dda3562b4e613d9be6f263cd5d4477c" Apr 16 18:05:52.944757 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:52.944727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qw5wn" event={"ID":"51ae1511-2083-4a0c-9d5a-a993e57fa083","Type":"ContainerStarted","Data":"6ce0389f19dd788fd2365aadd3cc481b09733926c09e2dbd467927aa54d56dc7"} Apr 16 18:05:56.958104 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:56.958021 2576 generic.go:358] "Generic (PLEG): container finished" podID="0720a490-adba-45e7-a242-c37073172c9a" containerID="1f34698826db846f0393a9c27c2918caf15e4ea2f1316c489374dc5dae7350ce" exitCode=0 Apr 16 18:05:56.958482 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:56.958097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" 
event={"ID":"0720a490-adba-45e7-a242-c37073172c9a","Type":"ContainerDied","Data":"1f34698826db846f0393a9c27c2918caf15e4ea2f1316c489374dc5dae7350ce"} Apr 16 18:05:56.958482 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:56.958443 2576 scope.go:117] "RemoveContainer" containerID="1f34698826db846f0393a9c27c2918caf15e4ea2f1316c489374dc5dae7350ce" Apr 16 18:05:57.963138 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:05:57.963099 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-gwrcv" event={"ID":"0720a490-adba-45e7-a242-c37073172c9a","Type":"ContainerStarted","Data":"b2eb2e1f375270aa8c04e46c3097f6bd3785db7f86b7c883cdedf9cc3eb6973e"} Apr 16 18:06:06.819875 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:06.819840 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:06.836989 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:06.836962 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:07.007446 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:07.007421 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:11.228560 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:11.228519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:06:11.230983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:11.230960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f342f33f-7ce1-4c45-a212-83b4c6fe1952-metrics-certs\") pod \"network-metrics-daemon-892g8\" (UID: \"f342f33f-7ce1-4c45-a212-83b4c6fe1952\") " pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:06:11.238480 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:11.238461 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2dnp6\"" Apr 16 18:06:11.245950 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:11.245934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-892g8" Apr 16 18:06:11.367349 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:11.367325 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-892g8"] Apr 16 18:06:11.369956 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:06:11.369931 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf342f33f_7ce1_4c45_a212_83b4c6fe1952.slice/crio-0f483eae1ac137e79df05723e83e60c73569ac835027aa189f3bd71cb0a609b9 WatchSource:0}: Error finding container 0f483eae1ac137e79df05723e83e60c73569ac835027aa189f3bd71cb0a609b9: Status 404 returned error can't find the container with id 0f483eae1ac137e79df05723e83e60c73569ac835027aa189f3bd71cb0a609b9 Apr 16 18:06:12.008320 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:12.008276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-892g8" event={"ID":"f342f33f-7ce1-4c45-a212-83b4c6fe1952","Type":"ContainerStarted","Data":"0f483eae1ac137e79df05723e83e60c73569ac835027aa189f3bd71cb0a609b9"} Apr 16 18:06:13.013337 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:13.013292 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-892g8" 
event={"ID":"f342f33f-7ce1-4c45-a212-83b4c6fe1952","Type":"ContainerStarted","Data":"e11b770d7f51d7fdba651e19f896d80bd7f75fb40218ffc30bca5f6749f564b6"} Apr 16 18:06:13.013337 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:13.013339 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-892g8" event={"ID":"f342f33f-7ce1-4c45-a212-83b4c6fe1952","Type":"ContainerStarted","Data":"1423ac0bd7671bcbd2524324003a20cb024b36fdd16436104a8470a44fe8bb2e"} Apr 16 18:06:13.035292 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:13.035206 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-892g8" podStartSLOduration=253.022427393 podStartE2EDuration="4m14.035190855s" podCreationTimestamp="2026-04-16 18:01:59 +0000 UTC" firstStartedPulling="2026-04-16 18:06:11.371888339 +0000 UTC m=+252.684567810" lastFinishedPulling="2026-04-16 18:06:12.384651801 +0000 UTC m=+253.697331272" observedRunningTime="2026-04-16 18:06:13.033606456 +0000 UTC m=+254.346285955" watchObservedRunningTime="2026-04-16 18:06:13.035190855 +0000 UTC m=+254.347870380" Apr 16 18:06:24.934454 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:24.934349 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:24.935032 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:24.934848 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="prometheus" containerID="cri-o://c5ac7e91ec683c72878e167707699ce9005241904235052177117bab40c44f00" gracePeriod=600 Apr 16 18:06:24.935032 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:24.934857 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy-thanos" 
containerID="cri-o://539712fb9465ef88ffbb6d52c46c496d828fad7db140cc0186c076bd50c391e9" gracePeriod=600 Apr 16 18:06:24.935032 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:24.934914 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="config-reloader" containerID="cri-o://dcb58e4e2f1ab824ed2909e023b32a8a86440b3c901068ae80d488bf4957624e" gracePeriod=600 Apr 16 18:06:24.935032 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:24.934901 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="thanos-sidecar" containerID="cri-o://c308b3cd6872dc158c926decb751cfbcfd8d53c911ac7af798bbf00c3fdf2844" gracePeriod=600 Apr 16 18:06:24.935032 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:24.934865 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy-web" containerID="cri-o://4255d707c5816a732eaf6d54d29fe24006579ab128e345bd72662ec70a601445" gracePeriod=600 Apr 16 18:06:24.935329 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:24.935147 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy" containerID="cri-o://70dc8de984f2e2d2c4f79caad12b9c1ce5da505043a7962d6c0f3d92582dce30" gracePeriod=600 Apr 16 18:06:25.057954 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:25.057920 2576 generic.go:358] "Generic (PLEG): container finished" podID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerID="539712fb9465ef88ffbb6d52c46c496d828fad7db140cc0186c076bd50c391e9" exitCode=0 Apr 16 18:06:25.057954 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:25.057947 2576 
generic.go:358] "Generic (PLEG): container finished" podID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerID="c308b3cd6872dc158c926decb751cfbcfd8d53c911ac7af798bbf00c3fdf2844" exitCode=0 Apr 16 18:06:25.057954 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:25.057959 2576 generic.go:358] "Generic (PLEG): container finished" podID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerID="dcb58e4e2f1ab824ed2909e023b32a8a86440b3c901068ae80d488bf4957624e" exitCode=0 Apr 16 18:06:25.058156 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:25.057967 2576 generic.go:358] "Generic (PLEG): container finished" podID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerID="c5ac7e91ec683c72878e167707699ce9005241904235052177117bab40c44f00" exitCode=0 Apr 16 18:06:25.058156 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:25.057990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerDied","Data":"539712fb9465ef88ffbb6d52c46c496d828fad7db140cc0186c076bd50c391e9"} Apr 16 18:06:25.058156 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:25.058023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerDied","Data":"c308b3cd6872dc158c926decb751cfbcfd8d53c911ac7af798bbf00c3fdf2844"} Apr 16 18:06:25.058156 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:25.058034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerDied","Data":"dcb58e4e2f1ab824ed2909e023b32a8a86440b3c901068ae80d488bf4957624e"} Apr 16 18:06:25.058156 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:25.058047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerDied","Data":"c5ac7e91ec683c72878e167707699ce9005241904235052177117bab40c44f00"} Apr 16 18:06:26.064291 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.064253 2576 generic.go:358] "Generic (PLEG): container finished" podID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerID="70dc8de984f2e2d2c4f79caad12b9c1ce5da505043a7962d6c0f3d92582dce30" exitCode=0 Apr 16 18:06:26.064291 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.064275 2576 generic.go:358] "Generic (PLEG): container finished" podID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerID="4255d707c5816a732eaf6d54d29fe24006579ab128e345bd72662ec70a601445" exitCode=0 Apr 16 18:06:26.064705 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.064338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerDied","Data":"70dc8de984f2e2d2c4f79caad12b9c1ce5da505043a7962d6c0f3d92582dce30"} Apr 16 18:06:26.064705 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.064370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerDied","Data":"4255d707c5816a732eaf6d54d29fe24006579ab128e345bd72662ec70a601445"} Apr 16 18:06:26.181851 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.181825 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:26.358952 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.358927 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-config\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359121 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.358955 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-tls\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359121 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.358978 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-metrics-client-certs\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359121 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359002 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-tls-assets\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359121 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-web-config\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359356 ip-10-0-137-213 kubenswrapper[2576]: 
I0416 18:06:26.359158 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-kube-rbac-proxy\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359356 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359187 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65wf\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-kube-api-access-c65wf\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359356 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359246 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-grpc-tls\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359356 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359275 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-db\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359356 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359305 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-serving-certs-ca-bundle\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359356 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359342 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-config-out\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359368 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359404 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-metrics-client-ca\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359439 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359473 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-kubelet-serving-ca-bundle\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359500 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-rulefiles-0\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359549 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-trusted-ca-bundle\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.359647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.359581 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-thanos-prometheus-http-client-file\") pod \"14604c37-82f6-42cb-98de-9da3ccb24d89\" (UID: \"14604c37-82f6-42cb-98de-9da3ccb24d89\") " Apr 16 18:06:26.361961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.361917 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.362306 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.362256 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.362836 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.362513 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-config" (OuterVolumeSpecName: "config") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.362836 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.362559 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-kube-api-access-c65wf" (OuterVolumeSpecName: "kube-api-access-c65wf") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "kube-api-access-c65wf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:26.362836 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.362799 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.363107 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.362936 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:26.363168 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.363104 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:26.363739 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.363495 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:26.363739 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.363534 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). 
InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:06:26.363739 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.363700 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.364154 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.364020 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:26.364219 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.364201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:06:26.364671 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.364638 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.364770 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.364571 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:06:26.365805 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.365777 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.366131 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.366104 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-config-out" (OuterVolumeSpecName: "config-out") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:06:26.366290 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.366271 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). 
InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.376079 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.376049 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-web-config" (OuterVolumeSpecName: "web-config") pod "14604c37-82f6-42cb-98de-9da3ccb24d89" (UID: "14604c37-82f6-42cb-98de-9da3ccb24d89"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:06:26.460503 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460462 2576 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-grpc-tls\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460503 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460494 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-db\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460503 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460505 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460503 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460515 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/14604c37-82f6-42cb-98de-9da3ccb24d89-config-out\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460527 2576 reconciler_common.go:299] "Volume detached for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460536 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-metrics-client-ca\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460545 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460554 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460563 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460572 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14604c37-82f6-42cb-98de-9da3ccb24d89-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:06:26.460581 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460589 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-config\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460598 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460607 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-metrics-client-certs\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460615 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-tls-assets\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460623 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-web-config\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460632 2576 reconciler_common.go:299] "Volume 
detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/14604c37-82f6-42cb-98de-9da3ccb24d89-secret-kube-rbac-proxy\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:26.460752 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:26.460640 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c65wf\" (UniqueName: \"kubernetes.io/projected/14604c37-82f6-42cb-98de-9da3ccb24d89-kube-api-access-c65wf\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:06:27.069716 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.069676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"14604c37-82f6-42cb-98de-9da3ccb24d89","Type":"ContainerDied","Data":"382458950cbd91ce122f1920580fc4f50378e74af8fcdba6aba94d5b9b11dfc5"} Apr 16 18:06:27.069716 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.069710 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.069716 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.069723 2576 scope.go:117] "RemoveContainer" containerID="539712fb9465ef88ffbb6d52c46c496d828fad7db140cc0186c076bd50c391e9" Apr 16 18:06:27.077953 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.077933 2576 scope.go:117] "RemoveContainer" containerID="70dc8de984f2e2d2c4f79caad12b9c1ce5da505043a7962d6c0f3d92582dce30" Apr 16 18:06:27.085280 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.085253 2576 scope.go:117] "RemoveContainer" containerID="4255d707c5816a732eaf6d54d29fe24006579ab128e345bd72662ec70a601445" Apr 16 18:06:27.092017 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.091992 2576 scope.go:117] "RemoveContainer" containerID="c308b3cd6872dc158c926decb751cfbcfd8d53c911ac7af798bbf00c3fdf2844" Apr 16 18:06:27.095025 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.095002 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:27.099659 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.099580 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:27.099722 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.099684 2576 scope.go:117] "RemoveContainer" containerID="dcb58e4e2f1ab824ed2909e023b32a8a86440b3c901068ae80d488bf4957624e" Apr 16 18:06:27.106629 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.106610 2576 scope.go:117] "RemoveContainer" containerID="c5ac7e91ec683c72878e167707699ce9005241904235052177117bab40c44f00" Apr 16 18:06:27.113583 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.113561 2576 scope.go:117] "RemoveContainer" containerID="ad3e063a8d22be4aa974f74182af76551aff8d02650c41121c0e7b8a72115bae" Apr 16 18:06:27.133787 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.133757 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:27.134090 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134075 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy" Apr 16 18:06:27.134090 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134091 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134098 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy-thanos" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134104 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy-thanos" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134109 
2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="init-config-reloader" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134116 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="init-config-reloader" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134124 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d9b3b2d-2a26-4003-848d-306ce8d13daa" containerName="registry" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134131 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d9b3b2d-2a26-4003-848d-306ce8d13daa" containerName="registry" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134137 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="prometheus" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134142 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="prometheus" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134151 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="thanos-sidecar" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134156 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="thanos-sidecar" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134163 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy-web" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: 
I0416 18:06:27.134168 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy-web" Apr 16 18:06:27.134167 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134176 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc1389cd-2c26-4408-9d07-aa77c03b4fed" containerName="console" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134181 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc1389cd-2c26-4408-9d07-aa77c03b4fed" containerName="console" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134190 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="config-reloader" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134195 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="config-reloader" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134263 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy-web" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134273 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy-thanos" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134279 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="prometheus" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134284 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="kube-rbac-proxy" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:06:27.134291 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="config-reloader" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134296 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc1389cd-2c26-4408-9d07-aa77c03b4fed" containerName="console" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134303 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d9b3b2d-2a26-4003-848d-306ce8d13daa" containerName="registry" Apr 16 18:06:27.134651 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.134308 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" containerName="thanos-sidecar" Apr 16 18:06:27.139478 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.139458 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.142613 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.142588 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:06:27.142812 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.142608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:06:27.142931 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.142617 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:06:27.143002 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.142928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:06:27.143002 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.142621 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:06:27.143194 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.142695 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sqg2h\"" Apr 16 18:06:27.143194 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.142931 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:06:27.143194 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.142651 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:06:27.143357 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.143249 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:06:27.143620 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.143601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eefgpp07cmo32\"" Apr 16 18:06:27.143781 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.143766 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:06:27.143906 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.143887 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:06:27.144027 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.144011 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:06:27.152547 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.152527 2576 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:06:27.155039 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.155016 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:06:27.169897 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.169863 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:27.245185 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.245142 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14604c37-82f6-42cb-98de-9da3ccb24d89" path="/var/lib/kubelet/pods/14604c37-82f6-42cb-98de-9da3ccb24d89/volumes" Apr 16 18:06:27.265668 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.265789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.265789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-config\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.265789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.265789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.265789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.265789 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266045 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265892 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266045 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266045 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.265986 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266045 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.266025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-config-out\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266045 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.266043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266218 ip-10-0-137-213 
kubenswrapper[2576]: I0416 18:06:27.266060 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266218 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.266082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266218 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.266141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266218 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.266171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266218 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.266197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-web-config\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.266424 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.266221 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xcnw\" (UniqueName: \"kubernetes.io/projected/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-kube-api-access-7xcnw\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.367559 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.367726 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.367726 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367590 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-web-config\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.367826 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367723 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7xcnw\" (UniqueName: \"kubernetes.io/projected/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-kube-api-access-7xcnw\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.367826 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.367932 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.367983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-config\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.367983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368087 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.367988 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368087 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368087 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368087 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
16 18:06:27.368335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368118 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-config-out\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368335 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368678 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.368678 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.368480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.371066 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.370973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-web-config\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.371829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.371288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-config\") pod 
\"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.371829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.371388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-config-out\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.371829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.371463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.371829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.371544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.371829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.371556 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.371829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.371770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.371829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.371776 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.372433 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.372410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.372698 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.372682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.373983 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.373964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.374125 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.374106 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.374314 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.374289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.374843 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.374824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.376712 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.376684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xcnw\" (UniqueName: \"kubernetes.io/projected/733c90a4-f31f-4e72-82b8-d8b7bc61fdd6-kube-api-access-7xcnw\") pod \"prometheus-k8s-0\" (UID: \"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.449370 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.449325 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:27.584305 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:27.584280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:06:27.586732 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:06:27.586700 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod733c90a4_f31f_4e72_82b8_d8b7bc61fdd6.slice/crio-444ae888990e94ff2d1c65f101848db3355174066318912506d2eb0fffb9e6fe WatchSource:0}: Error finding container 444ae888990e94ff2d1c65f101848db3355174066318912506d2eb0fffb9e6fe: Status 404 returned error can't find the container with id 444ae888990e94ff2d1c65f101848db3355174066318912506d2eb0fffb9e6fe Apr 16 18:06:28.074596 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:28.074562 2576 generic.go:358] "Generic (PLEG): container finished" podID="733c90a4-f31f-4e72-82b8-d8b7bc61fdd6" containerID="a90e2f7a1f1f0de0d3078049b86841ed0c86a7406f5e840533ddbbd4a40bd25e" exitCode=0 Apr 16 18:06:28.075075 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:28.074609 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerDied","Data":"a90e2f7a1f1f0de0d3078049b86841ed0c86a7406f5e840533ddbbd4a40bd25e"} Apr 16 18:06:28.075075 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:28.074629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerStarted","Data":"444ae888990e94ff2d1c65f101848db3355174066318912506d2eb0fffb9e6fe"} Apr 16 18:06:29.080816 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:29.080790 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log" Apr 16 
18:06:29.081206 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:29.081132 2576 generic.go:358] "Generic (PLEG): container finished" podID="733c90a4-f31f-4e72-82b8-d8b7bc61fdd6" containerID="4238125c93cf02a053dde04ca1c4d99e02075a918e7b59276b9acc6e8698e051" exitCode=2 Apr 16 18:06:29.081278 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:29.081214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerStarted","Data":"b64b41e8e291b5932d18e38eeb56a4b2c43c636057e4e7634d9a0211289d2104"} Apr 16 18:06:29.081278 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:29.081272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerStarted","Data":"d4f0e53201e314c12d981d6c17432cc81b40da1cf7f54ab0f7829b19621f0b18"} Apr 16 18:06:29.081340 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:29.081283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerStarted","Data":"5c03422b45e3e1362d7afe9bf5d42f4de519ba672c778810fd908130c573870c"} Apr 16 18:06:29.081340 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:29.081291 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerStarted","Data":"7e2b437cd8c319e1d522410ec66b8ddaef7d7a71a5d9a9e831cff7ef7babb619"} Apr 16 18:06:29.081340 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:29.081301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerStarted","Data":"547bb150a012b5975552a6015ddd042511e227ae8e1ff8ad8c0c934cf2aef11c"} Apr 16 18:06:29.081340 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:06:29.081309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerDied","Data":"4238125c93cf02a053dde04ca1c4d99e02075a918e7b59276b9acc6e8698e051"} Apr 16 18:06:29.081613 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:29.081597 2576 scope.go:117] "RemoveContainer" containerID="4238125c93cf02a053dde04ca1c4d99e02075a918e7b59276b9acc6e8698e051" Apr 16 18:06:30.087530 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:30.087502 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log" Apr 16 18:06:30.087978 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:30.087880 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"733c90a4-f31f-4e72-82b8-d8b7bc61fdd6","Type":"ContainerStarted","Data":"e81759db704bb0f1391a35395007ea104df869278f414a10e2f4d327797d99e2"} Apr 16 18:06:30.124162 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:30.124104 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.124088963 podStartE2EDuration="3.124088963s" podCreationTimestamp="2026-04-16 18:06:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:06:30.12127718 +0000 UTC m=+271.433956675" watchObservedRunningTime="2026-04-16 18:06:30.124088963 +0000 UTC m=+271.436768456" Apr 16 18:06:32.449633 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:32.449600 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:06:39.664720 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:39.664688 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-66bb55c95f-zcm47"] Apr 16 18:06:59.167866 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:59.167835 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log" Apr 16 18:06:59.168455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:59.167835 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log" Apr 16 18:06:59.179369 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:59.179338 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log" Apr 16 18:06:59.179687 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:59.179655 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log" Apr 16 18:06:59.189963 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:06:59.189944 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:07:04.685116 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:04.685046 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66bb55c95f-zcm47" podUID="01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" containerName="console" containerID="cri-o://6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9" gracePeriod=15 Apr 16 18:07:04.928317 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:04.928286 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66bb55c95f-zcm47_01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1/console/0.log" Apr 16 18:07:04.928442 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:04.928361 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66bb55c95f-zcm47" Apr 16 18:07:05.049425 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049329 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-oauth-config\") pod \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " Apr 16 18:07:05.049425 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049372 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-oauth-serving-cert\") pod \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " Apr 16 18:07:05.049425 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049402 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc5ck\" (UniqueName: \"kubernetes.io/projected/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-kube-api-access-fc5ck\") pod \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " Apr 16 18:07:05.049425 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049425 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-serving-cert\") pod \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " Apr 16 18:07:05.049753 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049443 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-config\") pod \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " Apr 16 
18:07:05.049753 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049468 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-trusted-ca-bundle\") pod \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " Apr 16 18:07:05.049753 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049496 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-service-ca\") pod \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\" (UID: \"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1\") " Apr 16 18:07:05.049914 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049759 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" (UID: "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:07:05.049971 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.049950 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-config" (OuterVolumeSpecName: "console-config") pod "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" (UID: "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:07:05.050021 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.050007 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-service-ca" (OuterVolumeSpecName: "service-ca") pod "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" (UID: "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:07:05.050077 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.050006 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" (UID: "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:07:05.051782 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.051750 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" (UID: "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:07:05.051782 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.051759 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-kube-api-access-fc5ck" (OuterVolumeSpecName: "kube-api-access-fc5ck") pod "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" (UID: "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1"). InnerVolumeSpecName "kube-api-access-fc5ck". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:07:05.051913 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.051874 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" (UID: "01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:07:05.150319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.150282 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-oauth-config\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:07:05.150319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.150313 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-oauth-serving-cert\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:07:05.150319 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.150322 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fc5ck\" (UniqueName: \"kubernetes.io/projected/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-kube-api-access-fc5ck\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:07:05.150531 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.150334 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-serving-cert\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:07:05.150531 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.150343 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-console-config\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:07:05.150531 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.150353 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-trusted-ca-bundle\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:07:05.150531 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.150361 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1-service-ca\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:07:05.192552 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.192528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66bb55c95f-zcm47_01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1/console/0.log" Apr 16 18:07:05.192689 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.192566 2576 generic.go:358] "Generic (PLEG): container finished" podID="01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" containerID="6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9" exitCode=2 Apr 16 18:07:05.192689 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.192639 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66bb55c95f-zcm47"
Apr 16 18:07:05.192757 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.192641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66bb55c95f-zcm47" event={"ID":"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1","Type":"ContainerDied","Data":"6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9"}
Apr 16 18:07:05.192757 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.192740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66bb55c95f-zcm47" event={"ID":"01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1","Type":"ContainerDied","Data":"9d101f223c887941046f2724d0d90eb67f2355599de162346a903b22e4047338"}
Apr 16 18:07:05.192757 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.192756 2576 scope.go:117] "RemoveContainer" containerID="6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9"
Apr 16 18:07:05.202802 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.202782 2576 scope.go:117] "RemoveContainer" containerID="6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9"
Apr 16 18:07:05.203073 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:07:05.203044 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9\": container with ID starting with 6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9 not found: ID does not exist" containerID="6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9"
Apr 16 18:07:05.203136 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.203076 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9"} err="failed to get container status \"6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9\": rpc error: code = NotFound desc = could not find container \"6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9\": container with ID starting with 6264748ab4f8e632c600b8fb508446ea0c9b1bda5051db9fd326efa2af06fde9 not found: ID does not exist"
Apr 16 18:07:05.215918 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.215892 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66bb55c95f-zcm47"]
Apr 16 18:07:05.221081 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.221061 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66bb55c95f-zcm47"]
Apr 16 18:07:05.238855 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:05.238828 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" path="/var/lib/kubelet/pods/01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1/volumes"
Apr 16 18:07:27.450330 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:27.450279 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:07:27.466607 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:27.466579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:07:28.273953 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:07:28.273920 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:11:18.262885 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.262816 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-2t59m"]
Apr 16 18:11:18.263324 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.263083 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" containerName="console"
Apr 16 18:11:18.263324 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.263098 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" containerName="console"
Apr 16 18:11:18.263324 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.263169 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="01bb9d3d-67b1-4ab6-ae9d-a782b377fdb1" containerName="console"
Apr 16 18:11:18.266063 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.266049 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-2t59m"
Apr 16 18:11:18.269795 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.269773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 18:11:18.270358 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.270340 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 18:11:18.270448 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.270341 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-vjpfs\""
Apr 16 18:11:18.270448 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.270410 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 18:11:18.284258 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.284218 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-2t59m"]
Apr 16 18:11:18.400355 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.400334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zddwt\" (UniqueName: \"kubernetes.io/projected/0598dfac-071d-42e6-a026-b782bba1ed5c-kube-api-access-zddwt\") pod \"s3-init-2t59m\" (UID: \"0598dfac-071d-42e6-a026-b782bba1ed5c\") " pod="kserve/s3-init-2t59m"
Apr 16 18:11:18.501524 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.501502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zddwt\" (UniqueName: \"kubernetes.io/projected/0598dfac-071d-42e6-a026-b782bba1ed5c-kube-api-access-zddwt\") pod \"s3-init-2t59m\" (UID: \"0598dfac-071d-42e6-a026-b782bba1ed5c\") " pod="kserve/s3-init-2t59m"
Apr 16 18:11:18.511507 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.511476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zddwt\" (UniqueName: \"kubernetes.io/projected/0598dfac-071d-42e6-a026-b782bba1ed5c-kube-api-access-zddwt\") pod \"s3-init-2t59m\" (UID: \"0598dfac-071d-42e6-a026-b782bba1ed5c\") " pod="kserve/s3-init-2t59m"
Apr 16 18:11:18.584606 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.584585 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-2t59m"
Apr 16 18:11:18.712434 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.712412 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-2t59m"]
Apr 16 18:11:18.714496 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:11:18.714465 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0598dfac_071d_42e6_a026_b782bba1ed5c.slice/crio-e76faf89c67e0566a3594001f0e98f4934dc228c60b143d41c8eee504bb83f79 WatchSource:0}: Error finding container e76faf89c67e0566a3594001f0e98f4934dc228c60b143d41c8eee504bb83f79: Status 404 returned error can't find the container with id e76faf89c67e0566a3594001f0e98f4934dc228c60b143d41c8eee504bb83f79
Apr 16 18:11:18.716103 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.716089 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:11:18.894762 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:18.894686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-2t59m" event={"ID":"0598dfac-071d-42e6-a026-b782bba1ed5c","Type":"ContainerStarted","Data":"e76faf89c67e0566a3594001f0e98f4934dc228c60b143d41c8eee504bb83f79"}
Apr 16 18:11:23.911679 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:23.911639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-2t59m" event={"ID":"0598dfac-071d-42e6-a026-b782bba1ed5c","Type":"ContainerStarted","Data":"8005a582674315af9c80065e109bc5ed77988a4c4f0d314ee62d445ea2863f0e"}
Apr 16 18:11:26.920829 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:26.920793 2576 generic.go:358] "Generic (PLEG): container finished" podID="0598dfac-071d-42e6-a026-b782bba1ed5c" containerID="8005a582674315af9c80065e109bc5ed77988a4c4f0d314ee62d445ea2863f0e" exitCode=0
Apr 16 18:11:26.921198 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:26.920868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-2t59m" event={"ID":"0598dfac-071d-42e6-a026-b782bba1ed5c","Type":"ContainerDied","Data":"8005a582674315af9c80065e109bc5ed77988a4c4f0d314ee62d445ea2863f0e"}
Apr 16 18:11:28.047578 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:28.047552 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-2t59m"
Apr 16 18:11:28.182942 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:28.182854 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zddwt\" (UniqueName: \"kubernetes.io/projected/0598dfac-071d-42e6-a026-b782bba1ed5c-kube-api-access-zddwt\") pod \"0598dfac-071d-42e6-a026-b782bba1ed5c\" (UID: \"0598dfac-071d-42e6-a026-b782bba1ed5c\") "
Apr 16 18:11:28.185283 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:28.185223 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0598dfac-071d-42e6-a026-b782bba1ed5c-kube-api-access-zddwt" (OuterVolumeSpecName: "kube-api-access-zddwt") pod "0598dfac-071d-42e6-a026-b782bba1ed5c" (UID: "0598dfac-071d-42e6-a026-b782bba1ed5c"). InnerVolumeSpecName "kube-api-access-zddwt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:11:28.284360 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:28.284317 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zddwt\" (UniqueName: \"kubernetes.io/projected/0598dfac-071d-42e6-a026-b782bba1ed5c-kube-api-access-zddwt\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\""
Apr 16 18:11:28.927885 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:28.927844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-2t59m" event={"ID":"0598dfac-071d-42e6-a026-b782bba1ed5c","Type":"ContainerDied","Data":"e76faf89c67e0566a3594001f0e98f4934dc228c60b143d41c8eee504bb83f79"}
Apr 16 18:11:28.927885 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:28.927874 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-2t59m"
Apr 16 18:11:28.928096 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:28.927878 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e76faf89c67e0566a3594001f0e98f4934dc228c60b143d41c8eee504bb83f79"
Apr 16 18:11:31.619791 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.619758 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f465495f5-lctm8"]
Apr 16 18:11:31.620178 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.620048 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0598dfac-071d-42e6-a026-b782bba1ed5c" containerName="s3-init"
Apr 16 18:11:31.620178 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.620064 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0598dfac-071d-42e6-a026-b782bba1ed5c" containerName="s3-init"
Apr 16 18:11:31.620178 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.620170 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0598dfac-071d-42e6-a026-b782bba1ed5c" containerName="s3-init"
Apr 16 18:11:31.623284 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.623267 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.626851 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.626828 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-87wwb\""
Apr 16 18:11:31.627649 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.627634 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:11:31.627707 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.627691 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:11:31.628318 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.628301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:11:31.628394 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.628380 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:11:31.628778 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.628762 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:11:31.637029 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.637010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:11:31.645489 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.645466 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f465495f5-lctm8"]
Apr 16 18:11:31.714804 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.714769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-trusted-ca-bundle\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.714804 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.714807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-service-ca\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.715030 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.714825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-config\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.715030 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.714907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-oauth-config\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.715030 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.714936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-serving-cert\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.715030 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.715015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-oauth-serving-cert\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.715194 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.715056 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcgcq\" (UniqueName: \"kubernetes.io/projected/a9b4d73a-64ad-41cf-9499-50e80567dca0-kube-api-access-zcgcq\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.816250 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.816199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-oauth-config\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.816411 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.816278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-serving-cert\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.816411 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.816310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-oauth-serving-cert\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.816411 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.816338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcgcq\" (UniqueName: \"kubernetes.io/projected/a9b4d73a-64ad-41cf-9499-50e80567dca0-kube-api-access-zcgcq\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.816411 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.816396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-trusted-ca-bundle\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.816629 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.816422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-service-ca\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.816629 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.816447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-config\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.817149 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.817115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-service-ca\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.817333 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.817216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-config\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.817333 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.817317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-oauth-serving-cert\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.817508 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.817344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b4d73a-64ad-41cf-9499-50e80567dca0-trusted-ca-bundle\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.818876 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.818855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-oauth-config\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.819098 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.818938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b4d73a-64ad-41cf-9499-50e80567dca0-console-serving-cert\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.827545 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.827522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcgcq\" (UniqueName: \"kubernetes.io/projected/a9b4d73a-64ad-41cf-9499-50e80567dca0-kube-api-access-zcgcq\") pod \"console-6f465495f5-lctm8\" (UID: \"a9b4d73a-64ad-41cf-9499-50e80567dca0\") " pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:31.933649 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:31.933574 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:32.060665 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:32.060641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f465495f5-lctm8"]
Apr 16 18:11:32.062118 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:11:32.062083 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9b4d73a_64ad_41cf_9499_50e80567dca0.slice/crio-52a73447849d112caa80c5a91c4350bbe9f2ebbdc6ebea16e6cbda4f4fe8404d WatchSource:0}: Error finding container 52a73447849d112caa80c5a91c4350bbe9f2ebbdc6ebea16e6cbda4f4fe8404d: Status 404 returned error can't find the container with id 52a73447849d112caa80c5a91c4350bbe9f2ebbdc6ebea16e6cbda4f4fe8404d
Apr 16 18:11:32.941921 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:32.941876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f465495f5-lctm8" event={"ID":"a9b4d73a-64ad-41cf-9499-50e80567dca0","Type":"ContainerStarted","Data":"cea2ca42bdd8983f1f6e2413d29ecce520083bea0258ffd90f2a27085b440349"}
Apr 16 18:11:32.941921 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:32.941918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f465495f5-lctm8" event={"ID":"a9b4d73a-64ad-41cf-9499-50e80567dca0","Type":"ContainerStarted","Data":"52a73447849d112caa80c5a91c4350bbe9f2ebbdc6ebea16e6cbda4f4fe8404d"}
Apr 16 18:11:41.934219 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:41.934184 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:41.934219 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:41.934245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:41.938772 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:41.938745 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:41.959926 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:41.959888 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f465495f5-lctm8" podStartSLOduration=10.959877642 podStartE2EDuration="10.959877642s" podCreationTimestamp="2026-04-16 18:11:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:11:32.972021469 +0000 UTC m=+574.284701131" watchObservedRunningTime="2026-04-16 18:11:41.959877642 +0000 UTC m=+583.272557135"
Apr 16 18:11:41.970665 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:41.970643 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f465495f5-lctm8"
Apr 16 18:11:59.198746 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:59.198718 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log"
Apr 16 18:11:59.199619 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:59.199598 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log"
Apr 16 18:11:59.208320 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:59.208293 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:11:59.209585 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:11:59.209556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:16:24.758346 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:24.758309 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"]
Apr 16 18:16:24.760491 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:24.760476 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"
Apr 16 18:16:24.763513 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:24.763488 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hlvv7\""
Apr 16 18:16:24.769754 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:24.769735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"
Apr 16 18:16:24.772482 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:24.772457 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"]
Apr 16 18:16:24.899601 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:24.899534 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"]
Apr 16 18:16:24.902200 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:16:24.902171 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd27bae24_1e37_49ad_9a45_19443ac5b68f.slice/crio-e62076546cc4fb9e930dbfae8177b4f1f88564579d41131ddd008a9f8b362406 WatchSource:0}: Error finding container e62076546cc4fb9e930dbfae8177b4f1f88564579d41131ddd008a9f8b362406: Status 404 returned error can't find the container with id e62076546cc4fb9e930dbfae8177b4f1f88564579d41131ddd008a9f8b362406
Apr 16 18:16:24.904072 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:24.904055 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:16:25.762831 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:25.762799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t" event={"ID":"d27bae24-1e37-49ad-9a45-19443ac5b68f","Type":"ContainerStarted","Data":"e62076546cc4fb9e930dbfae8177b4f1f88564579d41131ddd008a9f8b362406"}
Apr 16 18:16:26.767482 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:26.767445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t" event={"ID":"d27bae24-1e37-49ad-9a45-19443ac5b68f","Type":"ContainerStarted","Data":"ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27"}
Apr 16 18:16:26.767920 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:26.767595 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"
Apr 16 18:16:26.769386 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:26.769365 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"
Apr 16 18:16:26.783241 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:26.783173 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t" podStartSLOduration=1.7751108549999999 podStartE2EDuration="2.78315733s" podCreationTimestamp="2026-04-16 18:16:24 +0000 UTC" firstStartedPulling="2026-04-16 18:16:24.904201337 +0000 UTC m=+866.216880810" lastFinishedPulling="2026-04-16 18:16:25.912247813 +0000 UTC m=+867.224927285" observedRunningTime="2026-04-16 18:16:26.782689554 +0000 UTC m=+868.095369047" watchObservedRunningTime="2026-04-16 18:16:26.78315733 +0000 UTC m=+868.095836822"
Apr 16 18:16:59.222444 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:59.222372 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log"
Apr 16 18:16:59.224748 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:59.224724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log"
Apr 16 18:16:59.232299 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:59.232280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:16:59.234415 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:16:59.234396 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:17:49.803655 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:49.803622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-28c03-predictor-68cfff879d-vwr8t_d27bae24-1e37-49ad-9a45-19443ac5b68f/kserve-container/0.log"
Apr 16 18:17:50.090523 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:50.090445 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"]
Apr 16 18:17:50.090712 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:50.090676 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t" podUID="d27bae24-1e37-49ad-9a45-19443ac5b68f" containerName="kserve-container" containerID="cri-o://ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27" gracePeriod=30
Apr 16 18:17:50.332125 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:50.332097 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"
Apr 16 18:17:51.008801 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.008767 2576 generic.go:358] "Generic (PLEG): container finished" podID="d27bae24-1e37-49ad-9a45-19443ac5b68f" containerID="ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27" exitCode=2
Apr 16 18:17:51.009271 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.008858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t" event={"ID":"d27bae24-1e37-49ad-9a45-19443ac5b68f","Type":"ContainerDied","Data":"ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27"}
Apr 16 18:17:51.009271 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.008884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t" event={"ID":"d27bae24-1e37-49ad-9a45-19443ac5b68f","Type":"ContainerDied","Data":"e62076546cc4fb9e930dbfae8177b4f1f88564579d41131ddd008a9f8b362406"}
Apr 16 18:17:51.009271 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.008898 2576 scope.go:117] "RemoveContainer" containerID="ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27"
Apr 16 18:17:51.009271 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.008906 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"
Apr 16 18:17:51.017047 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.017031 2576 scope.go:117] "RemoveContainer" containerID="ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27"
Apr 16 18:17:51.017327 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:17:51.017304 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27\": container with ID starting with ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27 not found: ID does not exist" containerID="ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27"
Apr 16 18:17:51.017387 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.017341 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27"} err="failed to get container status \"ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27\": rpc error: code = NotFound desc = could not find container \"ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27\": container with ID starting with ae2c45ba7b4cb997a454f9f0f6cb1e9954bd755ab6c80a32bc9d2dbce4355a27 not found: ID does not exist"
Apr 16 18:17:51.032964 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.032935 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"]
Apr 16 18:17:51.037271 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.037250 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-28c03-predictor-68cfff879d-vwr8t"]
Apr 16 18:17:51.238269 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:17:51.238211 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d27bae24-1e37-49ad-9a45-19443ac5b68f" path="/var/lib/kubelet/pods/d27bae24-1e37-49ad-9a45-19443ac5b68f/volumes"
Apr 16 18:21:59.243841 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:21:59.243816 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log"
Apr 16 18:21:59.248638 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:21:59.248614 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log"
Apr 16 18:21:59.253264 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:21:59.253219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:21:59.257850 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:21:59.257834 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:24:01.265678 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.265642 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-69rv9/must-gather-2vn7v"]
Apr 16 18:24:01.266124 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.265913 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d27bae24-1e37-49ad-9a45-19443ac5b68f" containerName="kserve-container"
Apr 16 18:24:01.266124 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.265923 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27bae24-1e37-49ad-9a45-19443ac5b68f" containerName="kserve-container"
Apr 16 18:24:01.266124 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.265984 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d27bae24-1e37-49ad-9a45-19443ac5b68f" containerName="kserve-container"
Apr 16 18:24:01.268401 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.268385 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69rv9/must-gather-2vn7v"
Apr 16 18:24:01.271125 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.271105 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-69rv9\"/\"openshift-service-ca.crt\""
Apr 16 18:24:01.271265 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.271147 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-69rv9\"/\"default-dockercfg-96vfr\""
Apr 16 18:24:01.271775 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.271758 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-69rv9\"/\"kube-root-ca.crt\""
Apr 16 18:24:01.283981 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.283960 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-69rv9/must-gather-2vn7v"]
Apr 16 18:24:01.388111 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.388078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/501e7390-80ab-4571-852b-768c1edc1943-must-gather-output\") pod \"must-gather-2vn7v\" (UID: \"501e7390-80ab-4571-852b-768c1edc1943\") " pod="openshift-must-gather-69rv9/must-gather-2vn7v"
Apr 16 18:24:01.388111 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.388123 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674bk\" (UniqueName: \"kubernetes.io/projected/501e7390-80ab-4571-852b-768c1edc1943-kube-api-access-674bk\") pod \"must-gather-2vn7v\" (UID: \"501e7390-80ab-4571-852b-768c1edc1943\") " pod="openshift-must-gather-69rv9/must-gather-2vn7v"
Apr 16 18:24:01.489435 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.489394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/501e7390-80ab-4571-852b-768c1edc1943-must-gather-output\") pod \"must-gather-2vn7v\" (UID: \"501e7390-80ab-4571-852b-768c1edc1943\") " pod="openshift-must-gather-69rv9/must-gather-2vn7v"
Apr 16 18:24:01.489435 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.489439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-674bk\" (UniqueName: \"kubernetes.io/projected/501e7390-80ab-4571-852b-768c1edc1943-kube-api-access-674bk\") pod \"must-gather-2vn7v\" (UID: \"501e7390-80ab-4571-852b-768c1edc1943\") " pod="openshift-must-gather-69rv9/must-gather-2vn7v"
Apr 16 18:24:01.489750 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.489729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/501e7390-80ab-4571-852b-768c1edc1943-must-gather-output\") pod \"must-gather-2vn7v\" (UID: \"501e7390-80ab-4571-852b-768c1edc1943\") " pod="openshift-must-gather-69rv9/must-gather-2vn7v"
Apr 16 18:24:01.500788 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.500763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-674bk\" (UniqueName: \"kubernetes.io/projected/501e7390-80ab-4571-852b-768c1edc1943-kube-api-access-674bk\") pod \"must-gather-2vn7v\" (UID: \"501e7390-80ab-4571-852b-768c1edc1943\") " pod="openshift-must-gather-69rv9/must-gather-2vn7v"
Apr 16 18:24:01.586358 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.586330 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69rv9/must-gather-2vn7v" Apr 16 18:24:01.710744 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.710717 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-69rv9/must-gather-2vn7v"] Apr 16 18:24:01.712127 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:24:01.712097 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod501e7390_80ab_4571_852b_768c1edc1943.slice/crio-75a1094a1e0bacf82263524bc6aeded2305c807eb098cd9b678a515e8e77c217 WatchSource:0}: Error finding container 75a1094a1e0bacf82263524bc6aeded2305c807eb098cd9b678a515e8e77c217: Status 404 returned error can't find the container with id 75a1094a1e0bacf82263524bc6aeded2305c807eb098cd9b678a515e8e77c217 Apr 16 18:24:01.713776 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:01.713758 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:24:02.079292 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:02.079258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69rv9/must-gather-2vn7v" event={"ID":"501e7390-80ab-4571-852b-768c1edc1943","Type":"ContainerStarted","Data":"75a1094a1e0bacf82263524bc6aeded2305c807eb098cd9b678a515e8e77c217"} Apr 16 18:24:07.100390 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:07.100349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69rv9/must-gather-2vn7v" event={"ID":"501e7390-80ab-4571-852b-768c1edc1943","Type":"ContainerStarted","Data":"110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8"} Apr 16 18:24:07.100390 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:07.100395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69rv9/must-gather-2vn7v" 
event={"ID":"501e7390-80ab-4571-852b-768c1edc1943","Type":"ContainerStarted","Data":"0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd"} Apr 16 18:24:07.119949 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:07.119893 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-69rv9/must-gather-2vn7v" podStartSLOduration=1.856427679 podStartE2EDuration="6.119878773s" podCreationTimestamp="2026-04-16 18:24:01 +0000 UTC" firstStartedPulling="2026-04-16 18:24:01.713877793 +0000 UTC m=+1323.026557264" lastFinishedPulling="2026-04-16 18:24:05.977328887 +0000 UTC m=+1327.290008358" observedRunningTime="2026-04-16 18:24:07.117493687 +0000 UTC m=+1328.430173218" watchObservedRunningTime="2026-04-16 18:24:07.119878773 +0000 UTC m=+1328.432558266" Apr 16 18:24:24.162174 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:24.162087 2576 generic.go:358] "Generic (PLEG): container finished" podID="501e7390-80ab-4571-852b-768c1edc1943" containerID="110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8" exitCode=0 Apr 16 18:24:24.162174 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:24.162162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-69rv9/must-gather-2vn7v" event={"ID":"501e7390-80ab-4571-852b-768c1edc1943","Type":"ContainerDied","Data":"110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8"} Apr 16 18:24:24.162573 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:24.162472 2576 scope.go:117] "RemoveContainer" containerID="110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8" Apr 16 18:24:24.913699 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:24.913649 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-69rv9_must-gather-2vn7v_501e7390-80ab-4571-852b-768c1edc1943/gather/0.log" Apr 16 18:24:25.418205 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.418176 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-4f7nz/must-gather-xqm6r"] Apr 16 18:24:25.420606 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.420592 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4f7nz/must-gather-xqm6r" Apr 16 18:24:25.423094 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.423071 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4f7nz\"/\"openshift-service-ca.crt\"" Apr 16 18:24:25.423368 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.423343 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4f7nz\"/\"default-dockercfg-h5b95\"" Apr 16 18:24:25.423486 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.423408 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4f7nz\"/\"kube-root-ca.crt\"" Apr 16 18:24:25.436663 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.432437 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4f7nz/must-gather-xqm6r"] Apr 16 18:24:25.487757 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.487732 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdftm\" (UniqueName: \"kubernetes.io/projected/ca093a41-1f4b-4df6-8d1f-5383a9a9c47c-kube-api-access-tdftm\") pod \"must-gather-xqm6r\" (UID: \"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c\") " pod="openshift-must-gather-4f7nz/must-gather-xqm6r" Apr 16 18:24:25.487867 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.487792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca093a41-1f4b-4df6-8d1f-5383a9a9c47c-must-gather-output\") pod \"must-gather-xqm6r\" (UID: \"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c\") " pod="openshift-must-gather-4f7nz/must-gather-xqm6r" Apr 16 
18:24:25.588372 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.588346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdftm\" (UniqueName: \"kubernetes.io/projected/ca093a41-1f4b-4df6-8d1f-5383a9a9c47c-kube-api-access-tdftm\") pod \"must-gather-xqm6r\" (UID: \"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c\") " pod="openshift-must-gather-4f7nz/must-gather-xqm6r" Apr 16 18:24:25.588456 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.588399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca093a41-1f4b-4df6-8d1f-5383a9a9c47c-must-gather-output\") pod \"must-gather-xqm6r\" (UID: \"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c\") " pod="openshift-must-gather-4f7nz/must-gather-xqm6r" Apr 16 18:24:25.588654 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.588640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ca093a41-1f4b-4df6-8d1f-5383a9a9c47c-must-gather-output\") pod \"must-gather-xqm6r\" (UID: \"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c\") " pod="openshift-must-gather-4f7nz/must-gather-xqm6r" Apr 16 18:24:25.597160 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.597136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdftm\" (UniqueName: \"kubernetes.io/projected/ca093a41-1f4b-4df6-8d1f-5383a9a9c47c-kube-api-access-tdftm\") pod \"must-gather-xqm6r\" (UID: \"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c\") " pod="openshift-must-gather-4f7nz/must-gather-xqm6r" Apr 16 18:24:25.729634 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.729564 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4f7nz/must-gather-xqm6r" Apr 16 18:24:25.849369 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:25.849014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4f7nz/must-gather-xqm6r"] Apr 16 18:24:25.851154 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:24:25.851125 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca093a41_1f4b_4df6_8d1f_5383a9a9c47c.slice/crio-99e3f789562197f27ee1071dc894b33f11692d513c516d7dd218e71f6501c5d4 WatchSource:0}: Error finding container 99e3f789562197f27ee1071dc894b33f11692d513c516d7dd218e71f6501c5d4: Status 404 returned error can't find the container with id 99e3f789562197f27ee1071dc894b33f11692d513c516d7dd218e71f6501c5d4 Apr 16 18:24:26.167717 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:26.167685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/must-gather-xqm6r" event={"ID":"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c","Type":"ContainerStarted","Data":"99e3f789562197f27ee1071dc894b33f11692d513c516d7dd218e71f6501c5d4"} Apr 16 18:24:27.173972 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:27.173456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/must-gather-xqm6r" event={"ID":"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c","Type":"ContainerStarted","Data":"4f304a55d9eb6da95753a11be5ef98e25b41afb713c2a76d0fb10cd2c66383c6"} Apr 16 18:24:27.173972 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:27.173501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/must-gather-xqm6r" event={"ID":"ca093a41-1f4b-4df6-8d1f-5383a9a9c47c","Type":"ContainerStarted","Data":"0313b22ad8166c94467e28ab62b7e557993e22d080006f852f98747ca11e1b9e"} Apr 16 18:24:27.192790 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:27.192395 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-4f7nz/must-gather-xqm6r" podStartSLOduration=1.394501269 podStartE2EDuration="2.192375479s" podCreationTimestamp="2026-04-16 18:24:25 +0000 UTC" firstStartedPulling="2026-04-16 18:24:25.852895441 +0000 UTC m=+1347.165574915" lastFinishedPulling="2026-04-16 18:24:26.650769654 +0000 UTC m=+1347.963449125" observedRunningTime="2026-04-16 18:24:27.189660563 +0000 UTC m=+1348.502340057" watchObservedRunningTime="2026-04-16 18:24:27.192375479 +0000 UTC m=+1348.505054973" Apr 16 18:24:28.111876 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:28.111839 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-brlng_734a1c0e-a532-48d0-9ded-1550c1e4391c/global-pull-secret-syncer/0.log" Apr 16 18:24:28.253412 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:28.253383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nj79n_bd40f763-9175-4f8d-850e-89f05f5ff1b8/konnectivity-agent/0.log" Apr 16 18:24:28.334369 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:28.334340 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-213.ec2.internal_51f413f04a3f9e056bc3cfdb194d79d1/haproxy/0.log" Apr 16 18:24:30.262999 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.262962 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-69rv9/must-gather-2vn7v"] Apr 16 18:24:30.263912 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.263865 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-69rv9/must-gather-2vn7v" podUID="501e7390-80ab-4571-852b-768c1edc1943" containerName="copy" containerID="cri-o://0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd" gracePeriod=2 Apr 16 18:24:30.266756 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.266713 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-69rv9/must-gather-2vn7v"] Apr 16 18:24:30.267153 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.267104 2576 status_manager.go:895] "Failed to get status for pod" podUID="501e7390-80ab-4571-852b-768c1edc1943" pod="openshift-must-gather-69rv9/must-gather-2vn7v" err="pods \"must-gather-2vn7v\" is forbidden: User \"system:node:ip-10-0-137-213.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-69rv9\": no relationship found between node 'ip-10-0-137-213.ec2.internal' and this object" Apr 16 18:24:30.669255 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.664624 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-69rv9_must-gather-2vn7v_501e7390-80ab-4571-852b-768c1edc1943/copy/0.log" Apr 16 18:24:30.669255 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.665031 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-69rv9/must-gather-2vn7v" Apr 16 18:24:30.669586 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.669405 2576 status_manager.go:895] "Failed to get status for pod" podUID="501e7390-80ab-4571-852b-768c1edc1943" pod="openshift-must-gather-69rv9/must-gather-2vn7v" err="pods \"must-gather-2vn7v\" is forbidden: User \"system:node:ip-10-0-137-213.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-69rv9\": no relationship found between node 'ip-10-0-137-213.ec2.internal' and this object" Apr 16 18:24:30.740887 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.740848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674bk\" (UniqueName: \"kubernetes.io/projected/501e7390-80ab-4571-852b-768c1edc1943-kube-api-access-674bk\") pod \"501e7390-80ab-4571-852b-768c1edc1943\" (UID: \"501e7390-80ab-4571-852b-768c1edc1943\") " Apr 16 18:24:30.741172 ip-10-0-137-213 kubenswrapper[2576]: I0416 
18:24:30.740923 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/501e7390-80ab-4571-852b-768c1edc1943-must-gather-output\") pod \"501e7390-80ab-4571-852b-768c1edc1943\" (UID: \"501e7390-80ab-4571-852b-768c1edc1943\") " Apr 16 18:24:30.742368 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.742330 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501e7390-80ab-4571-852b-768c1edc1943-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "501e7390-80ab-4571-852b-768c1edc1943" (UID: "501e7390-80ab-4571-852b-768c1edc1943"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:24:30.751489 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.751451 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501e7390-80ab-4571-852b-768c1edc1943-kube-api-access-674bk" (OuterVolumeSpecName: "kube-api-access-674bk") pod "501e7390-80ab-4571-852b-768c1edc1943" (UID: "501e7390-80ab-4571-852b-768c1edc1943"). InnerVolumeSpecName "kube-api-access-674bk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:24:30.842382 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.842341 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-674bk\" (UniqueName: \"kubernetes.io/projected/501e7390-80ab-4571-852b-768c1edc1943-kube-api-access-674bk\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:24:30.842382 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:30.842382 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/501e7390-80ab-4571-852b-768c1edc1943-must-gather-output\") on node \"ip-10-0-137-213.ec2.internal\" DevicePath \"\"" Apr 16 18:24:31.192173 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.192091 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-69rv9_must-gather-2vn7v_501e7390-80ab-4571-852b-768c1edc1943/copy/0.log" Apr 16 18:24:31.192688 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.192655 2576 generic.go:358] "Generic (PLEG): container finished" podID="501e7390-80ab-4571-852b-768c1edc1943" containerID="0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd" exitCode=143 Apr 16 18:24:31.192799 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.192739 2576 scope.go:117] "RemoveContainer" containerID="0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd" Apr 16 18:24:31.192920 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.192902 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-69rv9/must-gather-2vn7v" Apr 16 18:24:31.196531 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.196499 2576 status_manager.go:895] "Failed to get status for pod" podUID="501e7390-80ab-4571-852b-768c1edc1943" pod="openshift-must-gather-69rv9/must-gather-2vn7v" err="pods \"must-gather-2vn7v\" is forbidden: User \"system:node:ip-10-0-137-213.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-69rv9\": no relationship found between node 'ip-10-0-137-213.ec2.internal' and this object" Apr 16 18:24:31.209495 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.208814 2576 scope.go:117] "RemoveContainer" containerID="110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8" Apr 16 18:24:31.212830 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.212795 2576 status_manager.go:895] "Failed to get status for pod" podUID="501e7390-80ab-4571-852b-768c1edc1943" pod="openshift-must-gather-69rv9/must-gather-2vn7v" err="pods \"must-gather-2vn7v\" is forbidden: User \"system:node:ip-10-0-137-213.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-69rv9\": no relationship found between node 'ip-10-0-137-213.ec2.internal' and this object" Apr 16 18:24:31.230566 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.230265 2576 scope.go:117] "RemoveContainer" containerID="0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd" Apr 16 18:24:31.231058 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:24:31.230787 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd\": container with ID starting with 0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd not found: ID does not exist" containerID="0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd" Apr 16 
18:24:31.231058 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.230826 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd"} err="failed to get container status \"0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd\": rpc error: code = NotFound desc = could not find container \"0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd\": container with ID starting with 0791d1ecbd5ff23b48719ca59f13c5ffa518509621bc2fe55bbba89604edf7bd not found: ID does not exist" Apr 16 18:24:31.231058 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.230854 2576 scope.go:117] "RemoveContainer" containerID="110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8" Apr 16 18:24:31.231447 ip-10-0-137-213 kubenswrapper[2576]: E0416 18:24:31.231350 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8\": container with ID starting with 110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8 not found: ID does not exist" containerID="110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8" Apr 16 18:24:31.231447 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.231392 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8"} err="failed to get container status \"110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8\": rpc error: code = NotFound desc = could not find container \"110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8\": container with ID starting with 110bb4203ce0e070e60d06fd6c2ccd85f2a4df94f576729420a7cf276a8601c8 not found: ID does not exist" Apr 16 18:24:31.241515 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:31.241493 2576 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501e7390-80ab-4571-852b-768c1edc1943" path="/var/lib/kubelet/pods/501e7390-80ab-4571-852b-768c1edc1943/volumes" Apr 16 18:24:32.371839 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.371783 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-92twm_95273443-3fd0-40e6-8ba2-20dd9bafad0a/monitoring-plugin/0.log" Apr 16 18:24:32.568512 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.568489 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xp882_ad872cd2-8712-402a-9d8d-4742f86d316b/node-exporter/0.log" Apr 16 18:24:32.592741 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.592710 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xp882_ad872cd2-8712-402a-9d8d-4742f86d316b/kube-rbac-proxy/0.log" Apr 16 18:24:32.618205 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.618097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xp882_ad872cd2-8712-402a-9d8d-4742f86d316b/init-textfile/0.log" Apr 16 18:24:32.736894 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.736812 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/0.log" Apr 16 18:24:32.743031 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.743000 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/prometheus/1.log" Apr 16 18:24:32.763557 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.763529 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/config-reloader/0.log" Apr 16 18:24:32.787734 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.787706 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/thanos-sidecar/0.log" Apr 16 18:24:32.811397 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.811369 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/kube-rbac-proxy-web/0.log" Apr 16 18:24:32.835246 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.835198 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/kube-rbac-proxy/0.log" Apr 16 18:24:32.860306 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.860282 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/kube-rbac-proxy-thanos/0.log" Apr 16 18:24:32.887472 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.887443 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_733c90a4-f31f-4e72-82b8-d8b7bc61fdd6/init-config-reloader/0.log" Apr 16 18:24:32.917771 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.917744 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-5vqgz_a465148d-0ff9-4c68-b379-ed74bdf8a280/prometheus-operator/0.log" Apr 16 18:24:32.953702 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.953662 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-5vqgz_a465148d-0ff9-4c68-b379-ed74bdf8a280/kube-rbac-proxy/0.log" Apr 16 18:24:32.985885 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:32.985846 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-fxd9n_6f516ce2-9a74-4204-b836-c2af2e82507e/prometheus-operator-admission-webhook/0.log" Apr 16 
18:24:34.920031 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.919996 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"] Apr 16 18:24:34.920594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.920444 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="501e7390-80ab-4571-852b-768c1edc1943" containerName="gather" Apr 16 18:24:34.920594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.920457 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="501e7390-80ab-4571-852b-768c1edc1943" containerName="gather" Apr 16 18:24:34.920594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.920466 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="501e7390-80ab-4571-852b-768c1edc1943" containerName="copy" Apr 16 18:24:34.920594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.920474 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="501e7390-80ab-4571-852b-768c1edc1943" containerName="copy" Apr 16 18:24:34.920594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.920550 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="501e7390-80ab-4571-852b-768c1edc1943" containerName="gather" Apr 16 18:24:34.920594 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.920565 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="501e7390-80ab-4571-852b-768c1edc1943" containerName="copy" Apr 16 18:24:34.923799 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.923776 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:34.932016 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.931980 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"]
Apr 16 18:24:34.980104 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.980072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrk4\" (UniqueName: \"kubernetes.io/projected/1608bcb3-75f5-453c-a568-52e231315355-kube-api-access-sqrk4\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:34.980296 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.980149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-proc\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:34.980296 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.980182 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-podres\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:34.980423 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.980293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-lib-modules\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:34.980423 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:34.980361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-sys\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.081755 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.081712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-lib-modules\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.081932 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.081800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-sys\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.081932 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.081846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrk4\" (UniqueName: \"kubernetes.io/projected/1608bcb3-75f5-453c-a568-52e231315355-kube-api-access-sqrk4\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.081932 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.081903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-proc\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.081932 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.081914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-lib-modules\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.082142 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.081930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-sys\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.082142 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.081936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-podres\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.082142 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.082024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-podres\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.082142 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.082041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1608bcb3-75f5-453c-a568-52e231315355-proc\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.090560 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.090535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrk4\" (UniqueName: \"kubernetes.io/projected/1608bcb3-75f5-453c-a568-52e231315355-kube-api-access-sqrk4\") pod \"perf-node-gather-daemonset-6khcc\" (UID: \"1608bcb3-75f5-453c-a568-52e231315355\") " pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.097461 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.097439 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f465495f5-lctm8_a9b4d73a-64ad-41cf-9499-50e80567dca0/console/0.log"
Apr 16 18:24:35.131714 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.131681 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-smccb_939eb83f-31cd-4a31-9e2b-68288a7cbf8d/download-server/0.log"
Apr 16 18:24:35.238295 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.238200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:35.365894 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.365866 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"]
Apr 16 18:24:35.368433 ip-10-0-137-213 kubenswrapper[2576]: W0416 18:24:35.368404 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1608bcb3_75f5_453c_a568_52e231315355.slice/crio-bde9349d86e9d0f84a1f5e0ee8585e88c35e631514f89381ac33efa342782b01 WatchSource:0}: Error finding container bde9349d86e9d0f84a1f5e0ee8585e88c35e631514f89381ac33efa342782b01: Status 404 returned error can't find the container with id bde9349d86e9d0f84a1f5e0ee8585e88c35e631514f89381ac33efa342782b01
Apr 16 18:24:35.552209 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:35.552178 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-qzg95_f337f655-cd92-41f3-a722-832193387a64/volume-data-source-validator/0.log"
Apr 16 18:24:36.211582 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:36.211544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc" event={"ID":"1608bcb3-75f5-453c-a568-52e231315355","Type":"ContainerStarted","Data":"bcc3540a7794b5c59e06a34b4abfe174da672f52d045634410e249484c932cf9"}
Apr 16 18:24:36.211582 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:36.211584 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc" event={"ID":"1608bcb3-75f5-453c-a568-52e231315355","Type":"ContainerStarted","Data":"bde9349d86e9d0f84a1f5e0ee8585e88c35e631514f89381ac33efa342782b01"}
Apr 16 18:24:36.211996 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:36.211612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:36.227183 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:36.227136 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc" podStartSLOduration=2.227121453 podStartE2EDuration="2.227121453s" podCreationTimestamp="2026-04-16 18:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:24:36.225717886 +0000 UTC m=+1357.538397379" watchObservedRunningTime="2026-04-16 18:24:36.227121453 +0000 UTC m=+1357.539800945"
Apr 16 18:24:36.243591 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:36.243566 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j6hg6_7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf/dns/0.log"
Apr 16 18:24:36.265024 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:36.264993 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j6hg6_7bb34a32-b49e-4b16-85ae-ffdf01b7c5bf/kube-rbac-proxy/0.log"
Apr 16 18:24:36.369508 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:36.369490 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-q59mj_798435e6-adbf-486f-bd1a-ba36ade6c8d3/dns-node-resolver/0.log"
Apr 16 18:24:36.849943 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:36.849918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vzsgf_446d8c35-b0da-42e5-a071-ea17b9747bb2/node-ca/0.log"
Apr 16 18:24:37.626094 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:37.626062 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58fbdb974b-w74b9_4db8d207-3ea1-45c0-91a6-c173825a77bb/router/0.log"
Apr 16 18:24:37.958680 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:37.958594 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4zjt2_cec75d81-47b0-42a8-b1a3-27ed663fc255/serve-healthcheck-canary/0.log"
Apr 16 18:24:38.374292 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:38.374260 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-qw5wn_51ae1511-2083-4a0c-9d5a-a993e57fa083/insights-operator/1.log"
Apr 16 18:24:38.374532 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:38.374511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-qw5wn_51ae1511-2083-4a0c-9d5a-a993e57fa083/insights-operator/0.log"
Apr 16 18:24:38.546647 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:38.546620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tzzx2_7074862a-ed98-49ff-8632-fc350bb47fe1/kube-rbac-proxy/0.log"
Apr 16 18:24:38.568567 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:38.568539 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tzzx2_7074862a-ed98-49ff-8632-fc350bb47fe1/exporter/0.log"
Apr 16 18:24:38.592501 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:38.592472 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tzzx2_7074862a-ed98-49ff-8632-fc350bb47fe1/extractor/0.log"
Apr 16 18:24:40.678074 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:40.678045 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-2t59m_0598dfac-071d-42e6-a026-b782bba1ed5c/s3-init/0.log"
Apr 16 18:24:42.225455 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:42.225430 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4f7nz/perf-node-gather-daemonset-6khcc"
Apr 16 18:24:44.971496 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:44.971416 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-p5x8q_7f544ce1-56ea-420f-b0f3-067278e84ad0/migrator/0.log"
Apr 16 18:24:44.995000 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:44.994970 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-p5x8q_7f544ce1-56ea-420f-b0f3-067278e84ad0/graceful-termination/0.log"
Apr 16 18:24:45.314489 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:45.314462 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-gwrcv_0720a490-adba-45e7-a242-c37073172c9a/kube-storage-version-migrator-operator/1.log"
Apr 16 18:24:45.315795 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:45.315764 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-gwrcv_0720a490-adba-45e7-a242-c37073172c9a/kube-storage-version-migrator-operator/0.log"
Apr 16 18:24:46.487961 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:46.487935 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gnj8w_b9e0fd91-9b40-48e9-87ac-be0b97367fc5/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:24:46.511843 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:46.511759 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gnj8w_b9e0fd91-9b40-48e9-87ac-be0b97367fc5/egress-router-binary-copy/0.log"
Apr 16 18:24:46.534334 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:46.534300 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gnj8w_b9e0fd91-9b40-48e9-87ac-be0b97367fc5/cni-plugins/0.log"
Apr 16 18:24:46.556127 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:46.556103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gnj8w_b9e0fd91-9b40-48e9-87ac-be0b97367fc5/bond-cni-plugin/0.log"
Apr 16 18:24:46.578747 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:46.578711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gnj8w_b9e0fd91-9b40-48e9-87ac-be0b97367fc5/routeoverride-cni/0.log"
Apr 16 18:24:46.619676 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:46.619653 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gnj8w_b9e0fd91-9b40-48e9-87ac-be0b97367fc5/whereabouts-cni-bincopy/0.log"
Apr 16 18:24:46.646401 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:46.646373 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gnj8w_b9e0fd91-9b40-48e9-87ac-be0b97367fc5/whereabouts-cni/0.log"
Apr 16 18:24:46.855528 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:46.855500 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dh898_a787fe1a-b0bd-4485-b5c9-a196d280f7c1/kube-multus/0.log"
Apr 16 18:24:47.007015 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:47.006936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-892g8_f342f33f-7ce1-4c45-a212-83b4c6fe1952/network-metrics-daemon/0.log"
Apr 16 18:24:47.046856 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:47.046830 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-892g8_f342f33f-7ce1-4c45-a212-83b4c6fe1952/kube-rbac-proxy/0.log"
Apr 16 18:24:48.005654 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.005626 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-controller/0.log"
Apr 16 18:24:48.029778 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.029748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/0.log"
Apr 16 18:24:48.043988 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.043947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovn-acl-logging/1.log"
Apr 16 18:24:48.072182 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.072162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/kube-rbac-proxy-node/0.log"
Apr 16 18:24:48.100813 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.100792 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:24:48.127467 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.127451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/northd/0.log"
Apr 16 18:24:48.156169 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.156147 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/nbdb/0.log"
Apr 16 18:24:48.179637 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.179608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/sbdb/0.log"
Apr 16 18:24:48.390506 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:48.390468 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-k7zg2_d16ce647-f47f-4f7b-9607-e47c6d4e67ce/ovnkube-controller/0.log"
Apr 16 18:24:50.013997 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:50.013967 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ghw9q_310f5d23-e68e-46b7-808d-ca6cb602e572/network-check-target-container/0.log"
Apr 16 18:24:50.995703 ip-10-0-137-213 kubenswrapper[2576]: I0416 18:24:50.995668 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jql59_1934f30b-41be-47ce-b2a7-9accbed71976/iptables-alerter/0.log"