Apr 20 17:48:28.776962 ip-10-0-137-82 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 17:48:29.266947 ip-10-0-137-82 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:29.266947 ip-10-0-137-82 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 17:48:29.266947 ip-10-0-137-82 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:29.266947 ip-10-0-137-82 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 17:48:29.266947 ip-10-0-137-82 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 17:48:29.269480 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.269369 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 17:48:29.272345 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272331 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 17:48:29.272345 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272345 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272349 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272352 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272355 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272359 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272362 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272365 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272368 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272371 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272374 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272377 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272380 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272392 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272395 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272398 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272400 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272403 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272406 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272408 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 17:48:29.272408 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272411 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272415 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272418 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272421 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272424 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272426 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272429 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272431 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272434 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272436 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272439 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272442 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272444 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272447 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272449 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272452 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272454 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272457 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272459 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272462 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 17:48:29.272874 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272464 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272467 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272470 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272472 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272475 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272477 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272480 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272482 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272485 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272487 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272490 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272492 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272495 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272498 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272502 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272505 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272507 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272510 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272512 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272515 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 17:48:29.273380 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272517 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272522 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272525 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272529 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272532 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272535 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272538 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272540 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272543 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272545 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272548 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272551 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272553 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272556 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272558 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272561 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272565 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
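Editor's note: the deprecation warnings at the top of this log all point at the config file named by --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump further down). Below is a minimal, hypothetical sketch of how those flag values would look as KubeletConfiguration fields, assuming the stock kubelet.config.k8s.io/v1beta1 schema; on an OpenShift node this file is typically rendered by the machine-config operator rather than edited by hand.

# Sketch only: flag-to-config mapping implied by the deprecation warnings above.
# Values are copied from the FLAG dump later in this log.
kind: KubeletConfiguration
apiVersion: kubelet.config.k8s.io/v1beta1
containerRuntimeEndpoint: /var/run/crio/crio.sock              # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec   # replaces --volume-plugin-dir
systemReserved:                                                # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
# --minimum-container-ttl-duration has no direct config-file equivalent; per the warning,
# container cleanup should be driven by evictionHard / evictionSoft thresholds instead.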
Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272569 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272572 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:29.273886 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272576 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272579 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272581 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272584 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272587 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272590 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272593 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272977 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272983 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272985 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272988 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272991 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272993 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272996 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.272999 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273001 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273004 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273006 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273009 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273012 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 17:48:29.274336 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273014 2575 
feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273017 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273020 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273022 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273024 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273027 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273029 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273032 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273035 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273038 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273040 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273043 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273046 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273048 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273051 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273054 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273057 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273059 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273062 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273065 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:29.274820 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273068 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273070 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273073 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 
17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273075 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273078 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273080 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273082 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273085 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273087 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273090 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273095 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273098 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273101 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273104 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273107 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273109 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273112 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273114 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273116 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 17:48:29.275602 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273119 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273121 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273124 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273126 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273129 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273131 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 
17:48:29.273134 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273137 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273140 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273142 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273145 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273148 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273151 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273154 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273156 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273159 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273161 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273164 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273166 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273169 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:29.276138 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273172 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273175 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273177 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273181 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273185 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273188 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273190 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273193 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273195 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273198 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273201 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273203 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273206 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.273208 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274506 2575 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274517 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274525 2575 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274529 2575 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274534 2575 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274540 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274545 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 17:48:29.276643 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274549 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274553 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274556 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274559 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274563 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274566 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274569 2575 flags.go:64] FLAG: 
--cgroup-root="" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274572 2575 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274575 2575 flags.go:64] FLAG: --client-ca-file="" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274593 2575 flags.go:64] FLAG: --cloud-config="" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274598 2575 flags.go:64] FLAG: --cloud-provider="external" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274601 2575 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274606 2575 flags.go:64] FLAG: --cluster-domain="" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274609 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274612 2575 flags.go:64] FLAG: --config-dir="" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274615 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274618 2575 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274622 2575 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274625 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274629 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274632 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274635 2575 flags.go:64] FLAG: --contention-profiling="false" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274638 2575 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274641 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274644 2575 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 17:48:29.277280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274647 2575 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274652 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274655 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274658 2575 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274660 2575 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274665 2575 flags.go:64] FLAG: --enable-server="true" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274668 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274673 2575 flags.go:64] FLAG: --event-burst="100" Apr 
20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274676 2575 flags.go:64] FLAG: --event-qps="50" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274680 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274683 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274697 2575 flags.go:64] FLAG: --eviction-hard="" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274701 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274704 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274707 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274710 2575 flags.go:64] FLAG: --eviction-soft="" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274714 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274717 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274720 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274723 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274726 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274729 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274731 2575 flags.go:64] FLAG: --feature-gates="" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274735 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274738 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 17:48:29.277897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274742 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274745 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274748 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274751 2575 flags.go:64] FLAG: --help="false" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274754 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274758 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274761 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274764 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274767 2575 flags.go:64] 
FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274771 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274774 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274777 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274781 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274784 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274834 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274885 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274891 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274897 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.274902 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.276719 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277241 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277247 2575 flags.go:64] FLAG: --lock-file="" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277251 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277255 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 17:48:29.278511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277260 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277275 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277278 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277282 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277285 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277288 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277292 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277295 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277298 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277303 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 17:48:29.279104 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:48:29.277307 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277311 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277315 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277318 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277321 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277325 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277328 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277331 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277334 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277342 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277346 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277349 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277352 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 17:48:29.279104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277355 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277361 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277365 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277368 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277371 2575 flags.go:64] FLAG: --port="10250" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277375 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277378 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06fda92afe0e83dcc" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277382 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277385 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277388 2575 flags.go:64] FLAG: --register-node="true" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277391 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277395 2575 flags.go:64] FLAG: --register-with-taints="" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277398 2575 flags.go:64] 
FLAG: --registry-burst="10" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277401 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277404 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277407 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277412 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277415 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277418 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277421 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277423 2575 flags.go:64] FLAG: --runonce="false" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277426 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277429 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277432 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277435 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277438 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 17:48:29.279673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277442 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277445 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277448 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277451 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277454 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277457 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277459 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277462 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277465 2575 flags.go:64] FLAG: --system-cgroups="" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277468 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277475 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277479 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277482 2575 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277487 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277490 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277493 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277496 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277499 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277502 2575 flags.go:64] FLAG: --v="2" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277506 2575 flags.go:64] FLAG: --version="false" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277510 2575 flags.go:64] FLAG: --vmodule="" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277515 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.277518 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277614 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:29.280331 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277619 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277622 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277625 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277628 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277631 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277634 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277636 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277639 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277641 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277644 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277647 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277650 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277652 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:29.280952 
ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277655 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277658 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277661 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277663 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277666 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277669 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277671 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:29.280952 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277674 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277677 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277680 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277683 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277699 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277702 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277705 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277708 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277711 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277714 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277717 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277719 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277722 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277725 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277728 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 
17:48:29.277730 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277733 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277735 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277738 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277741 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 17:48:29.281449 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277743 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277746 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277748 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277753 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277756 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277759 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277761 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277764 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277767 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277769 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277772 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277775 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277778 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277781 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277783 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277786 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277789 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277791 2575 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277794 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:29.281941 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277797 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277799 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277802 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277805 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277808 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277810 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277813 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277816 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277818 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277821 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277823 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277826 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277829 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277831 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277834 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277836 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277839 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277841 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277845 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
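
The repeated feature_gate.go:328 warnings above come from gate names the kubelet's upstream feature-gate registry does not know (OpenShift cluster-level gates passed through the node's gate set); they are warned about and skipped, while recognized gates, including the deprecated KMSv1 (feature_gate.go:349) and the GA ServiceAccountTokenNodeBinding (feature_gate.go:351), are applied and later summarized at feature_gate.go:384. The Go sketch below is a minimal, hypothetical illustration of that filter-and-warn pattern; it is not the kubelet's actual implementation, and the gate names in it are examples only.

```go
package main

import (
	"fmt"
	"log"
)

// known lists a few gates this toy component recognizes; anything else is
// warned about and ignored, mirroring the feature_gate.go:328 messages above.
var known = map[string]bool{
	"KMSv1":                          true, // deprecated upstream gate
	"ServiceAccountTokenNodeBinding": true, // GA upstream gate
	"UserNamespacesSupport":          true,
}

// apply filters requested gates against the known set and returns the
// effective map, logging a warning for each unrecognized name.
func apply(requested map[string]bool) map[string]bool {
	effective := map[string]bool{}
	for name, enabled := range requested {
		if !known[name] {
			log.Printf("unrecognized feature gate: %s", name)
			continue
		}
		effective[name] = enabled
	}
	return effective
}

func main() {
	requested := map[string]bool{
		"KMSv1":                          true,
		"ServiceAccountTokenNodeBinding": true,
		"PinnedImages":                   true, // cluster-only name, unknown here
	}
	fmt.Printf("feature gates: %v\n", apply(requested))
}
```
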
Apr 20 17:48:29.282402 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277849 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:29.282875 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277852 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 17:48:29.282875 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277854 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:29.282875 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277857 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:29.282875 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277859 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:29.282875 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277863 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:29.282875 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.277865 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:29.282875 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.279242 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 17:48:29.285389 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.285370 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 17:48:29.285427 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.285390 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285436 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285442 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285445 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285448 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285452 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285454 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285457 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285460 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:29.285461 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285463 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285466 2575 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285469 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285472 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285475 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285477 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285480 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285483 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285486 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285488 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285491 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285493 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285496 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285498 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285501 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285503 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285506 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285508 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285511 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:29.285697 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285513 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285516 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285518 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285521 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285524 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 
17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285527 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285530 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285532 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285534 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285537 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285539 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285542 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285546 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285549 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285552 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285555 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285558 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285561 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285564 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285566 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:29.286162 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285569 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285571 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285573 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285576 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285579 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285581 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285584 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285587 2575 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285589 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285592 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285594 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285597 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285599 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285603 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285607 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285610 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285613 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285617 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285619 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:29.286683 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285622 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285625 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285627 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285629 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285632 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285634 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285637 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285639 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285643 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285646 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285648 2575 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285651 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285654 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285656 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285659 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285662 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285665 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285668 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285670 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:29.287187 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285672 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.285678 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285798 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285805 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285808 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285811 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285813 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285816 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285819 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285822 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285826 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285829 2575 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285834 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285837 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285840 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 17:48:29.287651 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285843 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285846 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285849 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285852 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285854 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285857 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285859 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285863 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285866 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285868 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285871 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285873 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285876 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285879 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285881 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285884 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285886 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285889 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285891 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285894 
2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 17:48:29.288037 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285896 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285898 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285901 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285903 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285906 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285908 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285911 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285913 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285916 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285918 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285922 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285924 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285926 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285929 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285932 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285934 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285937 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285939 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285942 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285945 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 17:48:29.288536 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285947 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285950 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 
17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285952 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285955 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285957 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285960 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285962 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285965 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285968 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285970 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285973 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285976 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285978 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285980 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285984 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285987 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285990 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285992 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285995 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 17:48:29.289024 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.285997 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286000 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286002 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286005 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286008 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286010 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286013 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286015 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286018 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286020 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286022 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286025 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286028 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:29.286030 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.286035 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 17:48:29.289482 ip-10-0-137-82 kubenswrapper[2575]: 
I0420 17:48:29.286136 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 17:48:29.289998 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.289984 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 17:48:29.291308 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.291296 2575 server.go:1019] "Starting client certificate rotation" Apr 20 17:48:29.291411 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.291394 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 17:48:29.291455 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.291433 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 17:48:29.323544 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.323522 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 17:48:29.326086 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.326072 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 17:48:29.343074 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.343053 2575 log.go:25] "Validated CRI v1 runtime API" Apr 20 17:48:29.348313 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.348288 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 17:48:29.349230 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.349217 2575 log.go:25] "Validated CRI v1 image API" Apr 20 17:48:29.352543 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.352523 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 17:48:29.358657 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.358636 2575 fs.go:135] Filesystem UUIDs: map[19dd2ce5-2e1a-4911-996e-c921960328ef:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9c658623-2c16-4e28-95d0-586df1bad306:/dev/nvme0n1p3] Apr 20 17:48:29.358741 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.358657 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 17:48:29.364411 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.364303 2575 manager.go:217] Machine: {Timestamp:2026-04-20 17:48:29.36227221 +0000 UTC m=+0.455114651 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104774 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28fb9bfe5e692d24534a9b9aa784a0 SystemUUID:ec28fb9b-fe5e-692d-2453-4a9b9aa784a0 BootID:70e7c81d-8227-4650-9ff1-b75bad48f6db Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 
Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:10:23:50:eb:b1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:10:23:50:eb:b1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:cf:56:cf:fc:68 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 17:48:29.364411 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.364407 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
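
The fs.go:135/fs.go:136 entries above list the filesystem UUIDs and partitions cAdvisor discovered (/boot on ext4, /var on xfs, the tmpfs mounts, and the composefs overlay root), and manager.go:217 dumps the full machine inventory. As a rough standalone equivalent (an assumption for illustration, not cAdvisor's actual code path), the same device/mountpoint/fstype triples can be read from /proc/self/mounts:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// Print device, mountpoint and filesystem type for each mount, similar in
// spirit to the "Filesystem partitions" map logged at kubelet startup.
func main() {
	f, err := os.Open("/proc/self/mounts")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	sc := bufio.NewScanner(f)
	for sc.Scan() {
		// Each line is: device mountpoint fstype options dump pass
		fields := strings.Fields(sc.Text())
		if len(fields) < 3 {
			continue
		}
		fmt.Printf("%-20s %-15s %s\n", fields[0], fields[1], fields[2])
	}
	if err := sc.Err(); err != nil {
		panic(err)
	}
}
```
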
Apr 20 17:48:29.364517 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.364484 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 17:48:29.366234 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.366214 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 17:48:29.366373 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.366237 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-82.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 17:48:29.366420 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.366382 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 17:48:29.366420 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.366391 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 17:48:29.366420 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.366403 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 17:48:29.366502 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.366420 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 17:48:29.367769 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.367753 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zjpf9" Apr 20 17:48:29.369297 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.369287 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 17:48:29.369456 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.369447 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 17:48:29.371964 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:48:29.371956 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 20 17:48:29.371997 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.371973 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 17:48:29.371997 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.371986 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 17:48:29.371997 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.371995 2575 kubelet.go:397] "Adding apiserver pod source" Apr 20 17:48:29.372084 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.372004 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 17:48:29.373023 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.373010 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 17:48:29.373093 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.373028 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 17:48:29.373392 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.373373 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zjpf9" Apr 20 17:48:29.377166 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.377147 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 17:48:29.378365 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.378350 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 17:48:29.381743 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381730 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 17:48:29.381743 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381748 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 17:48:29.381743 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381755 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 17:48:29.381867 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381761 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 17:48:29.381867 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381767 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 17:48:29.381867 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381773 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 17:48:29.381867 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381788 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 17:48:29.381867 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381794 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 17:48:29.381867 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381800 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 17:48:29.381867 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381806 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 17:48:29.381867 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.381822 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 17:48:29.381867 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:48:29.381832 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 17:48:29.383402 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.383390 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 17:48:29.383402 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.383401 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 17:48:29.384261 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.384247 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:29.386271 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.386242 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:29.387370 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.387354 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 17:48:29.387451 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.387407 2575 server.go:1295] "Started kubelet" Apr 20 17:48:29.387523 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.387484 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 17:48:29.387576 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.387541 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 17:48:29.387625 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.387590 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 17:48:29.388189 ip-10-0-137-82 systemd[1]: Started Kubernetes Kubelet. Apr 20 17:48:29.389009 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.388874 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 17:48:29.389990 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.389975 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 20 17:48:29.390410 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.390386 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-82.ec2.internal" not found Apr 20 17:48:29.395004 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.394986 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 17:48:29.395629 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.395612 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 17:48:29.396609 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:29.396590 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-82.ec2.internal\" not found" Apr 20 17:48:29.397107 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397092 2575 factory.go:55] Registering systemd factory Apr 20 17:48:29.397199 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397115 2575 factory.go:223] Registration of the systemd container factory successfully Apr 20 17:48:29.397297 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397284 2575 factory.go:153] Registering CRI-O factory Apr 20 17:48:29.397353 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397299 2575 factory.go:223] Registration of the crio container factory successfully Apr 20 17:48:29.397353 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397342 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: 
dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 17:48:29.397471 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397362 2575 factory.go:103] Registering Raw factory Apr 20 17:48:29.397471 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397373 2575 manager.go:1196] Started watching for new ooms in manager Apr 20 17:48:29.397471 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397456 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 17:48:29.397471 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397459 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 17:48:29.397655 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397484 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 17:48:29.397655 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397576 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 20 17:48:29.397655 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.397581 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 20 17:48:29.398194 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:29.398176 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 17:48:29.398268 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.398245 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:29.398349 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.398332 2575 manager.go:319] Starting recovery of all containers Apr 20 17:48:29.401352 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:29.401327 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-82.ec2.internal\" not found" node="ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.405985 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.405967 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-82.ec2.internal" not found Apr 20 17:48:29.409806 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.409646 2575 manager.go:324] Recovery completed Apr 20 17:48:29.414517 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.414505 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 17:48:29.416196 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.416180 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-82.ec2.internal" event="NodeHasSufficientMemory" Apr 20 17:48:29.416262 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.416209 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-82.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 17:48:29.416262 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.416221 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-82.ec2.internal" event="NodeHasSufficientPID" Apr 20 17:48:29.416658 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.416644 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 17:48:29.416730 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.416658 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 17:48:29.416730 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.416676 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 17:48:29.418796 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:48:29.418782 2575 policy_none.go:49] "None policy: Start" Apr 20 17:48:29.418796 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.418797 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 17:48:29.418923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.418806 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.455167 2575 manager.go:341] "Starting Device Plugin manager" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:29.455199 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.455209 2575 server.go:85] "Starting device plugin registration server" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.455402 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.455412 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.455495 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.455567 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.455576 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:29.456131 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:29.456162 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-82.ec2.internal\" not found" Apr 20 17:48:29.470802 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.461444 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-82.ec2.internal" not found Apr 20 17:48:29.541567 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.541498 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 17:48:29.542642 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.542616 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 17:48:29.542750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.542651 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 17:48:29.542750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.542703 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
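
The Node Config dump at container_manager_linux.go:275 above shows SystemReserved memory of 1Gi and a hard eviction threshold of 100Mi for memory.available, while the machine inventory reports MemoryCapacity:33164492800 and KubeReserved is null. In general, node allocatable memory is capacity minus kube-reserved, system-reserved, and the hard eviction threshold; the short sketch below just runs that arithmetic with the values taken from this log.

```go
package main

import "fmt"

// Allocatable memory = capacity - kubeReserved - systemReserved - hardEviction.
// Constants are taken from the MemoryCapacity and Node Config entries in this log.
func main() {
	const (
		gi = int64(1) << 30
		mi = int64(1) << 20

		capacity       = int64(33164492800) // MemoryCapacity from manager.go:217
		kubeReserved   = int64(0)           // KubeReserved is null in the Node Config
		systemReserved = 1 * gi             // SystemReserved memory: "1Gi"
		hardEviction   = 100 * mi           // memory.available threshold: "100Mi"
	)

	allocatable := capacity - kubeReserved - systemReserved - hardEviction
	fmt.Printf("allocatable memory: %d bytes (~%.2f GiB)\n",
		allocatable, float64(allocatable)/float64(gi))
}
```
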
Apr 20 17:48:29.542750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.542710 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 17:48:29.542893 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:29.542748 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 17:48:29.547547 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.547526 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:29.556345 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.556332 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 17:48:29.557377 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.557363 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-82.ec2.internal" event="NodeHasSufficientMemory" Apr 20 17:48:29.557468 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.557395 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-82.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 17:48:29.557468 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.557412 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-82.ec2.internal" event="NodeHasSufficientPID" Apr 20 17:48:29.557468 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.557442 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.566800 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.566780 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.643232 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.643200 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal"] Apr 20 17:48:29.648145 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.648129 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.648217 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.648137 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.677360 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.677341 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.681641 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.681628 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.692018 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.692002 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 17:48:29.694339 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.694326 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 17:48:29.798615 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.798564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a699ffd5c0f23bc0832563bc36fe8c03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal\" (UID: \"a699ffd5c0f23bc0832563bc36fe8c03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.798615 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.798590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3e49c9e8f0fc73a695dc3fdeacc6ba89-config\") pod \"kube-apiserver-proxy-ip-10-0-137-82.ec2.internal\" (UID: \"3e49c9e8f0fc73a695dc3fdeacc6ba89\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.798615 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.798608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a699ffd5c0f23bc0832563bc36fe8c03-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal\" (UID: \"a699ffd5c0f23bc0832563bc36fe8c03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.898801 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.898778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a699ffd5c0f23bc0832563bc36fe8c03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal\" (UID: \"a699ffd5c0f23bc0832563bc36fe8c03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.898888 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.898817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a699ffd5c0f23bc0832563bc36fe8c03-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal\" (UID: \"a699ffd5c0f23bc0832563bc36fe8c03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.898888 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.898864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/3e49c9e8f0fc73a695dc3fdeacc6ba89-config\") pod \"kube-apiserver-proxy-ip-10-0-137-82.ec2.internal\" (UID: \"3e49c9e8f0fc73a695dc3fdeacc6ba89\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.898888 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.898883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a699ffd5c0f23bc0832563bc36fe8c03-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal\" (UID: \"a699ffd5c0f23bc0832563bc36fe8c03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.898989 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.898905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a699ffd5c0f23bc0832563bc36fe8c03-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal\" (UID: \"a699ffd5c0f23bc0832563bc36fe8c03\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.898989 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.898941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3e49c9e8f0fc73a695dc3fdeacc6ba89-config\") pod \"kube-apiserver-proxy-ip-10-0-137-82.ec2.internal\" (UID: \"3e49c9e8f0fc73a695dc3fdeacc6ba89\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.993601 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.993578 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" Apr 20 17:48:29.997258 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:29.997224 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" Apr 20 17:48:30.291048 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.291012 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 17:48:30.291635 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.291178 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 17:48:30.291635 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.291187 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 17:48:30.291635 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.291200 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 17:48:30.372723 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.372678 2575 apiserver.go:52] "Watching apiserver" Apr 20 17:48:30.374856 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.374820 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 17:43:29 +0000 UTC" deadline="2027-10-24 14:10:46.917681985 +0000 UTC" Apr 20 17:48:30.374856 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.374854 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13244h22m16.542830814s" Apr 20 17:48:30.383377 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.383355 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 17:48:30.384304 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.384282 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-2mz86","kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc","openshift-dns/node-resolver-5gv96","openshift-image-registry/node-ca-xgfmk","openshift-multus/network-metrics-daemon-rlsjc","openshift-network-diagnostics/network-check-target-drg8v","openshift-cluster-node-tuning-operator/tuned-bv449","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal","openshift-multus/multus-4cnfn","openshift-multus/multus-additional-cni-plugins-9jtw5","openshift-network-operator/iptables-alerter-7xrzc","openshift-ovn-kubernetes/ovnkube-node-nfzpp"] Apr 20 17:48:30.387242 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.387227 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:30.389780 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.389760 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 17:48:30.389876 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.389783 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wt47c\"" Apr 20 17:48:30.389980 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.389964 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 17:48:30.392302 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.392285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.394362 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.394346 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 17:48:30.394455 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.394395 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-s8pqs\"" Apr 20 17:48:30.394455 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.394404 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 17:48:30.394566 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.394552 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 17:48:30.394979 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.394965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.395081 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.395067 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.395137 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.395118 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 17:48:30.396972 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.396957 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8fp5d\"" Apr 20 17:48:30.397206 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.397192 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 17:48:30.397306 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.397289 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 17:48:30.397365 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.397291 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 17:48:30.397429 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.397414 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-r8fd2\"" Apr 20 17:48:30.397484 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.397423 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 17:48:30.397484 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.397447 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 17:48:30.397856 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.397839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:30.397923 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:30.397903 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:30.400407 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400389 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:30.400467 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:30.400452 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:30.400528 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799ea967-f6fc-4097-8bbf-38dcdeaa4107-host\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.400561 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400541 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:30.400590 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxhwq\" (UniqueName: \"kubernetes.io/projected/d849e597-7012-442a-96d2-7a13be37ff50-kube-api-access-fxhwq\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.400631 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70b177d7-2721-4fb0-85f6-0e4da108fdaf-tmp-dir\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.400666 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zntff\" (UniqueName: \"kubernetes.io/projected/70b177d7-2721-4fb0-85f6-0e4da108fdaf-kube-api-access-zntff\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.400733 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxkzd\" (UniqueName: \"kubernetes.io/projected/799ea967-f6fc-4097-8bbf-38dcdeaa4107-kube-api-access-hxkzd\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.400786 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400735 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d9d4b966-027c-4c5f-8280-4626703acbf5-konnectivity-ca\") pod \"konnectivity-agent-2mz86\" (UID: \"d9d4b966-027c-4c5f-8280-4626703acbf5\") " pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:30.400786 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-etc-selinux\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.400886 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:48:30.400801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-794dl\" (UniqueName: \"kubernetes.io/projected/07e151c2-7294-492d-b56b-1fc480d9ab69-kube-api-access-794dl\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:30.400886 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d9d4b966-027c-4c5f-8280-4626703acbf5-agent-certs\") pod \"konnectivity-agent-2mz86\" (UID: \"d9d4b966-027c-4c5f-8280-4626703acbf5\") " pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:30.400979 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-socket-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.400979 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-registration-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.400979 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/799ea967-f6fc-4097-8bbf-38dcdeaa4107-serviceca\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.401126 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.400964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.401126 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.401016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-device-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.401126 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.401050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-sys-fs\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.401126 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.401073 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70b177d7-2721-4fb0-85f6-0e4da108fdaf-hosts-file\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.402943 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.402924 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.405640 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.405202 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:48:30.405640 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.405396 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 17:48:30.405640 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.405461 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dgw2g\"" Apr 20 17:48:30.405835 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.405761 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.407897 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.407880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jmtxs\"" Apr 20 17:48:30.408166 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.408150 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.408240 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.408183 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 17:48:30.408407 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.408390 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 17:48:30.408486 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.408416 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 17:48:30.408486 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.408437 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 17:48:30.409381 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.409363 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 17:48:30.410247 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.410228 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 17:48:30.410338 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.410301 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cnhpv\"" Apr 20 17:48:30.410526 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.410506 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 17:48:30.413805 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.413787 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.413805 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.413796 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.416039 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.416018 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:48:30.416157 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.416052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 17:48:30.416372 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.416353 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 17:48:30.416714 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.416673 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 17:48:30.416714 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.416707 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-77kwv\"" Apr 20 17:48:30.416894 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.416681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 17:48:30.416894 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.416729 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 17:48:30.417038 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.417025 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 17:48:30.417083 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.417058 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jd47w\"" Apr 20 17:48:30.417165 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.417148 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 17:48:30.417220 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.417182 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 17:48:30.431136 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.431115 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xscqs" Apr 20 17:48:30.436907 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.436891 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xscqs" Apr 20 17:48:30.498675 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.498657 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 17:48:30.501531 ip-10-0-137-82 kubenswrapper[2575]: I0420 
17:48:30.501512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxhwq\" (UniqueName: \"kubernetes.io/projected/d849e597-7012-442a-96d2-7a13be37ff50-kube-api-access-fxhwq\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.501625 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.501546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/573305bc-ab90-4807-aab8-65f52ffaf213-cni-binary-copy\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.501625 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.501604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-socket-dir-parent\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.501749 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.501630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-cni-multus\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.501749 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.501653 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-systemd\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.501749 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.501676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-node-log\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-log-socket\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-etc-selinux\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a9f62776-b005-431d-94fc-62905a6ac33b-iptables-alerter-script\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d9d4b966-027c-4c5f-8280-4626703acbf5-agent-certs\") pod \"konnectivity-agent-2mz86\" (UID: \"d9d4b966-027c-4c5f-8280-4626703acbf5\") " pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-etc-selinux\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-multus-certs\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-kubernetes\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9f62776-b005-431d-94fc-62905a6ac33b-host-slash\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g84n\" (UniqueName: \"kubernetes.io/projected/b17266e6-f66d-4b28-88a9-100b2da4666a-kube-api-access-6g84n\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502507 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/799ea967-f6fc-4097-8bbf-38dcdeaa4107-serviceca\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " 
pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-device-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502540 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502579 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/573305bc-ab90-4807-aab8-65f52ffaf213-multus-daemon-config\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-modprobe-d\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zntff\" (UniqueName: \"kubernetes.io/projected/70b177d7-2721-4fb0-85f6-0e4da108fdaf-kube-api-access-zntff\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.504336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502671 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-hostroot\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-lib-modules\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-host\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502777 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-cni-netd\") pod \"ovnkube-node-nfzpp\" (UID: 
\"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b17266e6-f66d-4b28-88a9-100b2da4666a-ovn-node-metrics-cert\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d9d4b966-027c-4c5f-8280-4626703acbf5-konnectivity-ca\") pod \"konnectivity-agent-2mz86\" (UID: \"d9d4b966-027c-4c5f-8280-4626703acbf5\") " pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-systemd-units\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-cni-bin\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-794dl\" (UniqueName: \"kubernetes.io/projected/07e151c2-7294-492d-b56b-1fc480d9ab69-kube-api-access-794dl\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.502985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-system-cni-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-netns\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503043 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-slash\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-env-overrides\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-sys-fs\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70b177d7-2721-4fb0-85f6-0e4da108fdaf-hosts-file\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.505209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-system-cni-dir\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503234 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzlzt\" (UniqueName: \"kubernetes.io/projected/a9f62776-b005-431d-94fc-62905a6ac33b-kube-api-access-vzlzt\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-k8s-cni-cncf-io\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-kubelet\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-etc-kubernetes\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysctl-d\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysctl-conf\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503490 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-socket-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod 
\"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysconfig\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzfcv\" (UniqueName: \"kubernetes.io/projected/573305bc-ab90-4807-aab8-65f52ffaf213-kube-api-access-xzfcv\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-sys\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.505923 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-var-lib-kubelet\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-ovn\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799ea967-f6fc-4097-8bbf-38dcdeaa4107-host\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70b177d7-2721-4fb0-85f6-0e4da108fdaf-tmp-dir\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-cnibin\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a63c7804-111d-41d1-b298-301064054c3b-etc-tuned\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a63c7804-111d-41d1-b298-301064054c3b-tmp\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-run-netns\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503950 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-etc-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.503978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-ovnkube-config\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxkzd\" (UniqueName: \"kubernetes.io/projected/799ea967-f6fc-4097-8bbf-38dcdeaa4107-kube-api-access-hxkzd\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-systemd\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504068 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " 
pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqn6\" (UniqueName: \"kubernetes.io/projected/f4675b7e-7738-4100-be48-af94288931f3-kube-api-access-5jqn6\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504145 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-kubelet\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504170 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-var-lib-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.506401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-ovnkube-script-lib\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-registration-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-os-release\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-cnibin\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.504447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-os-release\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: 
I0420 17:48:30.504789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/70b177d7-2721-4fb0-85f6-0e4da108fdaf-tmp-dir\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-socket-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-cni-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/799ea967-f6fc-4097-8bbf-38dcdeaa4107-host\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-registration-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-cni-bin\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-sys-fs\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/70b177d7-2721-4fb0-85f6-0e4da108fdaf-hosts-file\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-conf-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d9d4b966-027c-4c5f-8280-4626703acbf5-konnectivity-ca\") pod \"konnectivity-agent-2mz86\" (UID: \"d9d4b966-027c-4c5f-8280-4626703acbf5\") " pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:30.506925 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-run\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.507396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/799ea967-f6fc-4097-8bbf-38dcdeaa4107-serviceca\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.507396 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:30.505559 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:30.507396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdnfw\" (UniqueName: \"kubernetes.io/projected/a63c7804-111d-41d1-b298-301064054c3b-kube-api-access-vdnfw\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.507396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.505746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d849e597-7012-442a-96d2-7a13be37ff50-device-dir\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.507396 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:30.505751 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:31.005715642 +0000 UTC m=+2.098558091 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:30.507396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.506576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d9d4b966-027c-4c5f-8280-4626703acbf5-agent-certs\") pod \"konnectivity-agent-2mz86\" (UID: \"d9d4b966-027c-4c5f-8280-4626703acbf5\") " pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:30.509959 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.509942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxhwq\" (UniqueName: \"kubernetes.io/projected/d849e597-7012-442a-96d2-7a13be37ff50-kube-api-access-fxhwq\") pod \"aws-ebs-csi-driver-node-5xlrc\" (UID: \"d849e597-7012-442a-96d2-7a13be37ff50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.513477 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.513457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zntff\" (UniqueName: \"kubernetes.io/projected/70b177d7-2721-4fb0-85f6-0e4da108fdaf-kube-api-access-zntff\") pod \"node-resolver-5gv96\" (UID: \"70b177d7-2721-4fb0-85f6-0e4da108fdaf\") " pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.513774 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.513717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxkzd\" (UniqueName: \"kubernetes.io/projected/799ea967-f6fc-4097-8bbf-38dcdeaa4107-kube-api-access-hxkzd\") pod \"node-ca-xgfmk\" (UID: \"799ea967-f6fc-4097-8bbf-38dcdeaa4107\") " pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.513774 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.513768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-794dl\" (UniqueName: \"kubernetes.io/projected/07e151c2-7294-492d-b56b-1fc480d9ab69-kube-api-access-794dl\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:30.549841 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.549813 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e49c9e8f0fc73a695dc3fdeacc6ba89.slice/crio-a16a142d2e2f0e48841a207e8692f7398460f1adac07f095e2cbe5a83a8ccbe3 WatchSource:0}: Error finding container a16a142d2e2f0e48841a207e8692f7398460f1adac07f095e2cbe5a83a8ccbe3: Status 404 returned error can't find the container with id a16a142d2e2f0e48841a207e8692f7398460f1adac07f095e2cbe5a83a8ccbe3 Apr 20 17:48:30.550286 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.550265 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda699ffd5c0f23bc0832563bc36fe8c03.slice/crio-d4b137a665d6b3f4388a68b7f420595f0f29f6861ac1add37de0bec767c0284d WatchSource:0}: Error finding container d4b137a665d6b3f4388a68b7f420595f0f29f6861ac1add37de0bec767c0284d: Status 404 returned error can't find the container with id d4b137a665d6b3f4388a68b7f420595f0f29f6861ac1add37de0bec767c0284d Apr 20 17:48:30.553650 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.553638 2575 provider.go:93] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 17:48:30.605984 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.605957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-system-cni-dir\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.605984 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.605979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-system-cni-dir\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.606153 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.605992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.606153 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzlzt\" (UniqueName: \"kubernetes.io/projected/a9f62776-b005-431d-94fc-62905a6ac33b-kube-api-access-vzlzt\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.606153 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-k8s-cni-cncf-io\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606153 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-kubelet\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-etc-kubernetes\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606201 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysctl-d\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606206 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-k8s-cni-cncf-io\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysctl-conf\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-kubelet\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-etc-kubernetes\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:30.606326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysconfig\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysctl-d\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysctl-conf\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzfcv\" (UniqueName: \"kubernetes.io/projected/573305bc-ab90-4807-aab8-65f52ffaf213-kube-api-access-xzfcv\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-sysconfig\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-sys\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-var-lib-kubelet\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-sys\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: 
\"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-ovn\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-var-lib-kubelet\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-cnibin\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-ovn\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-cnibin\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a63c7804-111d-41d1-b298-301064054c3b-etc-tuned\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a63c7804-111d-41d1-b298-301064054c3b-tmp\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.606860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-run-netns\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" 
Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-etc-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-ovnkube-config\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-systemd\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-run-netns\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606716 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-etc-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqn6\" (UniqueName: \"kubernetes.io/projected/f4675b7e-7738-4100-be48-af94288931f3-kube-api-access-5jqn6\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606734 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-systemd\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-kubelet\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-kubelet\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-var-lib-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-ovnkube-script-lib\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-os-release\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-cnibin\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-os-release\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.607751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.606991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-var-lib-openvswitch\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-cni-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-os-release\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-cni-bin\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-cnibin\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-conf-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-cni-bin\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-run\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-conf-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdnfw\" (UniqueName: \"kubernetes.io/projected/a63c7804-111d-41d1-b298-301064054c3b-kube-api-access-vdnfw\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-cni-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-run\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-ovnkube-config\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/573305bc-ab90-4807-aab8-65f52ffaf213-cni-binary-copy\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-socket-dir-parent\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4675b7e-7738-4100-be48-af94288931f3-os-release\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607293 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-multus-socket-dir-parent\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.608599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-cni-multus\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-systemd\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-ovnkube-script-lib\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-node-log\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-var-lib-cni-multus\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-node-log\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607498 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-run-systemd\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-log-socket\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-log-socket\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-run-ovn-kubernetes\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a9f62776-b005-431d-94fc-62905a6ac33b-iptables-alerter-script\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-multus-certs\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-kubernetes\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9f62776-b005-431d-94fc-62905a6ac33b-host-slash\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607679 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-multus-certs\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g84n\" (UniqueName: \"kubernetes.io/projected/b17266e6-f66d-4b28-88a9-100b2da4666a-kube-api-access-6g84n\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a9f62776-b005-431d-94fc-62905a6ac33b-host-slash\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/573305bc-ab90-4807-aab8-65f52ffaf213-multus-daemon-config\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 
17:48:30.607758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-kubernetes\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-modprobe-d\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/573305bc-ab90-4807-aab8-65f52ffaf213-cni-binary-copy\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-hostroot\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-lib-modules\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-host\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-etc-modprobe-d\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-cni-netd\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b17266e6-f66d-4b28-88a9-100b2da4666a-ovn-node-metrics-cert\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607947 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-lib-modules\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-systemd-units\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-cni-netd\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.607996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-cni-bin\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-hostroot\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-system-cni-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.609737 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a63c7804-111d-41d1-b298-301064054c3b-host\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608041 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a9f62776-b005-431d-94fc-62905a6ac33b-iptables-alerter-script\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608050 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-netns\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-slash\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-cni-bin\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-systemd-units\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-env-overrides\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-host-run-netns\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/573305bc-ab90-4807-aab8-65f52ffaf213-system-cni-dir\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b17266e6-f66d-4b28-88a9-100b2da4666a-host-slash\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/573305bc-ab90-4807-aab8-65f52ffaf213-multus-daemon-config\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4675b7e-7738-4100-be48-af94288931f3-cni-binary-copy\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b17266e6-f66d-4b28-88a9-100b2da4666a-env-overrides\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.608910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a63c7804-111d-41d1-b298-301064054c3b-etc-tuned\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.609052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a63c7804-111d-41d1-b298-301064054c3b-tmp\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.610216 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.609890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b17266e6-f66d-4b28-88a9-100b2da4666a-ovn-node-metrics-cert\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.612853 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:30.612835 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:30.612907 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:30.612859 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:30.612907 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:30.612878 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kp4hc for pod openshift-network-diagnostics/network-check-target-drg8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:30.612989 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:30.612956 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc podName:5d45e323-232c-48df-b245-96c179fcb1e3 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:31.112940023 +0000 UTC m=+2.205782467 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kp4hc" (UniqueName: "kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc") pod "network-check-target-drg8v" (UID: "5d45e323-232c-48df-b245-96c179fcb1e3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:30.614549 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.614527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqn6\" (UniqueName: \"kubernetes.io/projected/f4675b7e-7738-4100-be48-af94288931f3-kube-api-access-5jqn6\") pod \"multus-additional-cni-plugins-9jtw5\" (UID: \"f4675b7e-7738-4100-be48-af94288931f3\") " pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.614862 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.614844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdnfw\" (UniqueName: \"kubernetes.io/projected/a63c7804-111d-41d1-b298-301064054c3b-kube-api-access-vdnfw\") pod \"tuned-bv449\" (UID: \"a63c7804-111d-41d1-b298-301064054c3b\") " pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.614929 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.614897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzfcv\" (UniqueName: \"kubernetes.io/projected/573305bc-ab90-4807-aab8-65f52ffaf213-kube-api-access-xzfcv\") pod \"multus-4cnfn\" (UID: \"573305bc-ab90-4807-aab8-65f52ffaf213\") " pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.615530 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.615512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzlzt\" (UniqueName: \"kubernetes.io/projected/a9f62776-b005-431d-94fc-62905a6ac33b-kube-api-access-vzlzt\") pod \"iptables-alerter-7xrzc\" (UID: \"a9f62776-b005-431d-94fc-62905a6ac33b\") " pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.615987 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.615970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g84n\" (UniqueName: \"kubernetes.io/projected/b17266e6-f66d-4b28-88a9-100b2da4666a-kube-api-access-6g84n\") pod \"ovnkube-node-nfzpp\" (UID: \"b17266e6-f66d-4b28-88a9-100b2da4666a\") " pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.707728 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.707707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:30.713958 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.713936 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9d4b966_027c_4c5f_8280_4626703acbf5.slice/crio-e7ba05eda1cb789da1c0658ea8cd2714dbc7a000bd4e9d7faffbaeb292115c09 WatchSource:0}: Error finding container e7ba05eda1cb789da1c0658ea8cd2714dbc7a000bd4e9d7faffbaeb292115c09: Status 404 returned error can't find the container with id e7ba05eda1cb789da1c0658ea8cd2714dbc7a000bd4e9d7faffbaeb292115c09 Apr 20 17:48:30.741698 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.741664 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" Apr 20 17:48:30.746176 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.746161 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5gv96" Apr 20 17:48:30.747716 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.747684 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd849e597_7012_442a_96d2_7a13be37ff50.slice/crio-c32cefe1c8e0165056427d479f902391c7c8e1ba45f4492dab2db3dde984993a WatchSource:0}: Error finding container c32cefe1c8e0165056427d479f902391c7c8e1ba45f4492dab2db3dde984993a: Status 404 returned error can't find the container with id c32cefe1c8e0165056427d479f902391c7c8e1ba45f4492dab2db3dde984993a Apr 20 17:48:30.752180 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.752159 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70b177d7_2721_4fb0_85f6_0e4da108fdaf.slice/crio-1f1d27d51a594873e6644a0027d72484cc9175623cf6a5ae8c5b1a51eccadbe1 WatchSource:0}: Error finding container 1f1d27d51a594873e6644a0027d72484cc9175623cf6a5ae8c5b1a51eccadbe1: Status 404 returned error can't find the container with id 1f1d27d51a594873e6644a0027d72484cc9175623cf6a5ae8c5b1a51eccadbe1 Apr 20 17:48:30.764111 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.764086 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xgfmk" Apr 20 17:48:30.770629 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.770609 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod799ea967_f6fc_4097_8bbf_38dcdeaa4107.slice/crio-c0a1bf6481bfca94cc98b54c0092581fd1b9d2f169c5846eb2f7b1226e21fd68 WatchSource:0}: Error finding container c0a1bf6481bfca94cc98b54c0092581fd1b9d2f169c5846eb2f7b1226e21fd68: Status 404 returned error can't find the container with id c0a1bf6481bfca94cc98b54c0092581fd1b9d2f169c5846eb2f7b1226e21fd68 Apr 20 17:48:30.779013 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.778998 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bv449" Apr 20 17:48:30.783972 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.783949 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda63c7804_111d_41d1_b298_301064054c3b.slice/crio-7889f644ff45d1474075ef50686848c8a5f7dff21ac7bdfaf3de230b331a0324 WatchSource:0}: Error finding container 7889f644ff45d1474075ef50686848c8a5f7dff21ac7bdfaf3de230b331a0324: Status 404 returned error can't find the container with id 7889f644ff45d1474075ef50686848c8a5f7dff21ac7bdfaf3de230b331a0324 Apr 20 17:48:30.796300 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.796258 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4cnfn" Apr 20 17:48:30.801131 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.800815 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" Apr 20 17:48:30.802592 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.802568 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573305bc_ab90_4807_aab8_65f52ffaf213.slice/crio-46f66156f43a85066ad6ca90acbff23ba3c3e21a612511b71440dd810944949b WatchSource:0}: Error finding container 46f66156f43a85066ad6ca90acbff23ba3c3e21a612511b71440dd810944949b: Status 404 returned error can't find the container with id 46f66156f43a85066ad6ca90acbff23ba3c3e21a612511b71440dd810944949b Apr 20 17:48:30.806760 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.806740 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4675b7e_7738_4100_be48_af94288931f3.slice/crio-b74af065142c8e2e5884615f8a8c6d56c369468389fd96cb9d349f58b0cf27d5 WatchSource:0}: Error finding container b74af065142c8e2e5884615f8a8c6d56c369468389fd96cb9d349f58b0cf27d5: Status 404 returned error can't find the container with id b74af065142c8e2e5884615f8a8c6d56c369468389fd96cb9d349f58b0cf27d5 Apr 20 17:48:30.827569 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.827545 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7xrzc" Apr 20 17:48:30.832141 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:30.832121 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:30.833276 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.833256 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9f62776_b005_431d_94fc_62905a6ac33b.slice/crio-2eca748194fae733f516c298d75dc4a7cb4492ddabe1f0c9d31b08c317051333 WatchSource:0}: Error finding container 2eca748194fae733f516c298d75dc4a7cb4492ddabe1f0c9d31b08c317051333: Status 404 returned error can't find the container with id 2eca748194fae733f516c298d75dc4a7cb4492ddabe1f0c9d31b08c317051333 Apr 20 17:48:30.837706 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:48:30.837673 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb17266e6_f66d_4b28_88a9_100b2da4666a.slice/crio-c3a6d020a18525e8cc770e0a1e89338e7f16289f62efb0cd09e42d4e82a95689 WatchSource:0}: Error finding container c3a6d020a18525e8cc770e0a1e89338e7f16289f62efb0cd09e42d4e82a95689: Status 404 returned error can't find the container with id c3a6d020a18525e8cc770e0a1e89338e7f16289f62efb0cd09e42d4e82a95689 Apr 20 17:48:31.010953 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.010917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:31.011110 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:31.011077 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:31.011182 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:31.011138 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:32.01111877 +0000 UTC m=+3.103961202 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:31.156342 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.156309 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:31.212671 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.212638 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:31.213184 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.213160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:31.213381 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:31.213361 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:31.213463 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:31.213387 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:31.213463 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:31.213419 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kp4hc for pod openshift-network-diagnostics/network-check-target-drg8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:31.213569 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:31.213484 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc podName:5d45e323-232c-48df-b245-96c179fcb1e3 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:32.21345882 +0000 UTC m=+3.306301251 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kp4hc" (UniqueName: "kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc") pod "network-check-target-drg8v" (UID: "5d45e323-232c-48df-b245-96c179fcb1e3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:31.231034 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.231008 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 17:48:31.437772 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.437621 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 17:43:30 +0000 UTC" deadline="2027-09-18 22:44:11.431820628 +0000 UTC" Apr 20 17:48:31.437772 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.437667 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12388h55m39.994157933s" Apr 20 17:48:31.565622 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.565562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"c3a6d020a18525e8cc770e0a1e89338e7f16289f62efb0cd09e42d4e82a95689"} Apr 20 17:48:31.581103 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.581066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7xrzc" event={"ID":"a9f62776-b005-431d-94fc-62905a6ac33b","Type":"ContainerStarted","Data":"2eca748194fae733f516c298d75dc4a7cb4492ddabe1f0c9d31b08c317051333"} Apr 20 17:48:31.604238 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.604204 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" event={"ID":"f4675b7e-7738-4100-be48-af94288931f3","Type":"ContainerStarted","Data":"b74af065142c8e2e5884615f8a8c6d56c369468389fd96cb9d349f58b0cf27d5"} Apr 20 17:48:31.618113 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.618054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bv449" event={"ID":"a63c7804-111d-41d1-b298-301064054c3b","Type":"ContainerStarted","Data":"7889f644ff45d1474075ef50686848c8a5f7dff21ac7bdfaf3de230b331a0324"} Apr 20 17:48:31.630283 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.630249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5gv96" event={"ID":"70b177d7-2721-4fb0-85f6-0e4da108fdaf","Type":"ContainerStarted","Data":"1f1d27d51a594873e6644a0027d72484cc9175623cf6a5ae8c5b1a51eccadbe1"} Apr 20 17:48:31.634003 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.633975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" event={"ID":"a699ffd5c0f23bc0832563bc36fe8c03","Type":"ContainerStarted","Data":"d4b137a665d6b3f4388a68b7f420595f0f29f6861ac1add37de0bec767c0284d"} Apr 20 17:48:31.659089 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.659058 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4cnfn" event={"ID":"573305bc-ab90-4807-aab8-65f52ffaf213","Type":"ContainerStarted","Data":"46f66156f43a85066ad6ca90acbff23ba3c3e21a612511b71440dd810944949b"} Apr 20 17:48:31.672135 
ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.672093 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xgfmk" event={"ID":"799ea967-f6fc-4097-8bbf-38dcdeaa4107","Type":"ContainerStarted","Data":"c0a1bf6481bfca94cc98b54c0092581fd1b9d2f169c5846eb2f7b1226e21fd68"} Apr 20 17:48:31.698481 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.698391 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" event={"ID":"d849e597-7012-442a-96d2-7a13be37ff50","Type":"ContainerStarted","Data":"c32cefe1c8e0165056427d479f902391c7c8e1ba45f4492dab2db3dde984993a"} Apr 20 17:48:31.705247 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.705181 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2mz86" event={"ID":"d9d4b966-027c-4c5f-8280-4626703acbf5","Type":"ContainerStarted","Data":"e7ba05eda1cb789da1c0658ea8cd2714dbc7a000bd4e9d7faffbaeb292115c09"} Apr 20 17:48:31.713518 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:31.713492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" event={"ID":"3e49c9e8f0fc73a695dc3fdeacc6ba89","Type":"ContainerStarted","Data":"a16a142d2e2f0e48841a207e8692f7398460f1adac07f095e2cbe5a83a8ccbe3"} Apr 20 17:48:32.021285 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:32.021205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:32.021444 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:32.021401 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:32.021504 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:32.021468 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:34.021450104 +0000 UTC m=+5.114292538 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:32.223747 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:32.223147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:32.223747 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:32.223299 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:32.223747 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:32.223319 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:32.223747 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:32.223332 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kp4hc for pod openshift-network-diagnostics/network-check-target-drg8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:32.223747 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:32.223397 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc podName:5d45e323-232c-48df-b245-96c179fcb1e3 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:34.223377346 +0000 UTC m=+5.316219797 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kp4hc" (UniqueName: "kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc") pod "network-check-target-drg8v" (UID: "5d45e323-232c-48df-b245-96c179fcb1e3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:32.438323 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:32.438268 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 17:43:30 +0000 UTC" deadline="2028-02-04 17:25:22.326593944 +0000 UTC" Apr 20 17:48:32.438323 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:32.438313 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15719h36m49.888285826s" Apr 20 17:48:32.543463 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:32.543430 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:32.543632 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:32.543570 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:32.544113 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:32.544093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:32.544225 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:32.544198 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:34.044183 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:34.044097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:34.044645 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:34.044264 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:34.044645 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:34.044342 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:38.044317906 +0000 UTC m=+9.137160359 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:34.246670 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:34.246098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:34.246670 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:34.246247 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:34.246670 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:34.246266 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:34.246670 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:34.246278 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kp4hc for pod openshift-network-diagnostics/network-check-target-drg8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:34.246670 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:34.246334 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc podName:5d45e323-232c-48df-b245-96c179fcb1e3 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:38.246317288 +0000 UTC m=+9.339159721 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kp4hc" (UniqueName: "kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc") pod "network-check-target-drg8v" (UID: "5d45e323-232c-48df-b245-96c179fcb1e3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:34.542970 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:34.542938 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:34.543137 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:34.543061 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:34.543206 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:34.543139 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:34.543305 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:34.543254 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:36.543147 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:36.543056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:36.543587 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:36.543176 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:36.543587 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:36.543240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:36.543587 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:36.543360 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:38.079305 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:38.079138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:38.079305 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:38.079258 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:38.079880 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:38.079331 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:46.07931213 +0000 UTC m=+17.172154557 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:38.280426 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:38.280386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:38.280654 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:38.280556 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:38.280654 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:38.280574 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:38.280654 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:38.280587 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kp4hc for pod openshift-network-diagnostics/network-check-target-drg8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:38.280654 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:38.280645 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc podName:5d45e323-232c-48df-b245-96c179fcb1e3 nodeName:}" failed. No retries permitted until 2026-04-20 17:48:46.280625755 +0000 UTC m=+17.373468188 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kp4hc" (UniqueName: "kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc") pod "network-check-target-drg8v" (UID: "5d45e323-232c-48df-b245-96c179fcb1e3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:38.543564 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:38.543479 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:38.543759 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:38.543627 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:38.544162 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:38.543993 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:38.544162 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:38.544108 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:40.543018 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:40.542999 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:40.543304 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:40.543020 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:40.543304 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:40.543106 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:40.543304 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:40.543228 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:42.543736 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:42.543701 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:42.543736 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:42.543720 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:42.544227 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:42.543821 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:42.544227 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:42.543919 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:44.543742 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:44.543547 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:44.544198 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:44.543604 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:44.544198 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:44.543810 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:44.544198 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:44.543891 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:46.139991 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:46.139940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:46.140496 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:46.140093 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:46.140496 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:46.140178 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:02.140157327 +0000 UTC m=+33.232999756 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:48:46.341184 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:46.341145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:46.341360 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:46.341339 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:48:46.341407 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:46.341370 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:48:46.341407 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:46.341384 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kp4hc for pod openshift-network-diagnostics/network-check-target-drg8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:46.341481 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:46.341447 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc podName:5d45e323-232c-48df-b245-96c179fcb1e3 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:02.341429119 +0000 UTC m=+33.434271551 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kp4hc" (UniqueName: "kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc") pod "network-check-target-drg8v" (UID: "5d45e323-232c-48df-b245-96c179fcb1e3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:48:46.543192 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:46.543120 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:46.543324 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:46.543121 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:46.543324 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:46.543259 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:46.543324 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:46.543302 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:48.543588 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:48.543554 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:48.543912 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:48.543561 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:48.543912 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:48.543754 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:48.543912 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:48.543643 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:49.750939 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.750792 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 17:48:49.751299 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.751191 2575 generic.go:358] "Generic (PLEG): container finished" podID="b17266e6-f66d-4b28-88a9-100b2da4666a" containerID="e571430518369b04e6628657ef29e1cae7769c8de16986996e3380a44c3edf34" exitCode=1 Apr 20 17:48:49.751299 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.751250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"d833555b3ae776a5267e51785cd6f873ac353c67d24f867f9bf5d40f2ba4af2c"} Apr 20 17:48:49.751299 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.751274 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"6b699672ac40b51fcd7350ce214a076071daf5eeba818dbe0e0692070533a311"} Apr 20 17:48:49.751299 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.751285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"18fe06785fc4028a950a52480a5494888485c8f0ad940886f5e5f1bfdc35cd3f"} Apr 20 17:48:49.751299 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.751293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"a372961f69a69c796511865ba75adc592fabc5726daf7ebf2a5e1ac231bdcb38"} Apr 20 17:48:49.751299 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.751302 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerDied","Data":"e571430518369b04e6628657ef29e1cae7769c8de16986996e3380a44c3edf34"} Apr 20 17:48:49.751489 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.751312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"6f21256dcdfbd0e884855f5c6b3bb4745d0914271f2ade5759758a074dae1259"} Apr 20 17:48:49.752480 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.752459 2575 generic.go:358] "Generic (PLEG): container finished" podID="f4675b7e-7738-4100-be48-af94288931f3" containerID="ea30685b1a08dc711e9dd66f0ce7a586d669f11facdbfa2a8db4a16e55f4a716" exitCode=0 Apr 20 17:48:49.752566 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.752519 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" event={"ID":"f4675b7e-7738-4100-be48-af94288931f3","Type":"ContainerDied","Data":"ea30685b1a08dc711e9dd66f0ce7a586d669f11facdbfa2a8db4a16e55f4a716"} Apr 20 17:48:49.753849 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.753824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bv449" 
event={"ID":"a63c7804-111d-41d1-b298-301064054c3b","Type":"ContainerStarted","Data":"962343ac8a696159e6bd1c5b98f3a9eb6026382b03255d0f15a462be28eb41d6"} Apr 20 17:48:49.755095 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.755075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5gv96" event={"ID":"70b177d7-2721-4fb0-85f6-0e4da108fdaf","Type":"ContainerStarted","Data":"765bdebde09f4d06b2e7244d7744489ad365dc9271231a1841adf197f1366757"} Apr 20 17:48:49.756384 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.756367 2575 generic.go:358] "Generic (PLEG): container finished" podID="a699ffd5c0f23bc0832563bc36fe8c03" containerID="2795000f6719fd2a7d76649f8ffe267741466aca83e4ecf09590b0fe3b548783" exitCode=0 Apr 20 17:48:49.756464 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.756422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" event={"ID":"a699ffd5c0f23bc0832563bc36fe8c03","Type":"ContainerDied","Data":"2795000f6719fd2a7d76649f8ffe267741466aca83e4ecf09590b0fe3b548783"} Apr 20 17:48:49.757715 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.757671 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4cnfn" event={"ID":"573305bc-ab90-4807-aab8-65f52ffaf213","Type":"ContainerStarted","Data":"5a288161c2d78dc04986eab93668993c85e8e9502bcc600f8ce30e801373936b"} Apr 20 17:48:49.760836 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.760813 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xgfmk" event={"ID":"799ea967-f6fc-4097-8bbf-38dcdeaa4107","Type":"ContainerStarted","Data":"cc8f06406912ba016da1121b28ec902cff976d8f7a601970a2beddbd7844811f"} Apr 20 17:48:49.762308 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.762289 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" event={"ID":"d849e597-7012-442a-96d2-7a13be37ff50","Type":"ContainerStarted","Data":"bdee7903eca0b3c52f474594da0ecab4d0de048795c920a099c1f2a1bba4f9c4"} Apr 20 17:48:49.763673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.763650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2mz86" event={"ID":"d9d4b966-027c-4c5f-8280-4626703acbf5","Type":"ContainerStarted","Data":"eff46996015ae8d107941cd51c3056888ca7a22872045f4a6e1ce84f58dd3fd6"} Apr 20 17:48:49.764921 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.764901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" event={"ID":"3e49c9e8f0fc73a695dc3fdeacc6ba89","Type":"ContainerStarted","Data":"4146546f9272fe297e0bbc08548c391379658943da59a680929d9fa2d3702fc1"} Apr 20 17:48:49.793188 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.793142 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4cnfn" podStartSLOduration=2.594587136 podStartE2EDuration="20.793131351s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.804024037 +0000 UTC m=+1.896866480" lastFinishedPulling="2026-04-20 17:48:49.002568254 +0000 UTC m=+20.095410695" observedRunningTime="2026-04-20 17:48:49.792886027 +0000 UTC m=+20.885728481" watchObservedRunningTime="2026-04-20 17:48:49.793131351 +0000 UTC m=+20.885973800" Apr 20 17:48:49.832471 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.832430 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2mz86" podStartSLOduration=6.656268026 podStartE2EDuration="20.832414365s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.715521274 +0000 UTC m=+1.808363706" lastFinishedPulling="2026-04-20 17:48:44.8916676 +0000 UTC m=+15.984510045" observedRunningTime="2026-04-20 17:48:49.818506458 +0000 UTC m=+20.911348909" watchObservedRunningTime="2026-04-20 17:48:49.832414365 +0000 UTC m=+20.925256809" Apr 20 17:48:49.832673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.832647 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5gv96" podStartSLOduration=11.019163952 podStartE2EDuration="20.832641056s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.753635857 +0000 UTC m=+1.846478285" lastFinishedPulling="2026-04-20 17:48:40.567112947 +0000 UTC m=+11.659955389" observedRunningTime="2026-04-20 17:48:49.831879827 +0000 UTC m=+20.924722276" watchObservedRunningTime="2026-04-20 17:48:49.832641056 +0000 UTC m=+20.925483506" Apr 20 17:48:49.846328 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.846298 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xgfmk" podStartSLOduration=2.945585796 podStartE2EDuration="20.846288352s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.773546107 +0000 UTC m=+1.866388534" lastFinishedPulling="2026-04-20 17:48:48.674248646 +0000 UTC m=+19.767091090" observedRunningTime="2026-04-20 17:48:49.846217478 +0000 UTC m=+20.939059928" watchObservedRunningTime="2026-04-20 17:48:49.846288352 +0000 UTC m=+20.939130802" Apr 20 17:48:49.862932 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.862900 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bv449" podStartSLOduration=2.973014233 podStartE2EDuration="20.862890465s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.785144115 +0000 UTC m=+1.877986543" lastFinishedPulling="2026-04-20 17:48:48.675020343 +0000 UTC m=+19.767862775" observedRunningTime="2026-04-20 17:48:49.862431868 +0000 UTC m=+20.955274328" watchObservedRunningTime="2026-04-20 17:48:49.862890465 +0000 UTC m=+20.955732915" Apr 20 17:48:49.886898 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:49.886857 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-82.ec2.internal" podStartSLOduration=20.886841277 podStartE2EDuration="20.886841277s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:48:49.886242506 +0000 UTC m=+20.979084955" watchObservedRunningTime="2026-04-20 17:48:49.886841277 +0000 UTC m=+20.979683728" Apr 20 17:48:50.520174 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:50.520149 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 17:48:50.543082 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:50.543065 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:50.543169 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:50.543113 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:50.543220 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:50.543200 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:50.543343 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:50.543316 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:50.768570 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:50.768477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7xrzc" event={"ID":"a9f62776-b005-431d-94fc-62905a6ac33b","Type":"ContainerStarted","Data":"bc71d6dfb19d94eab48c994f92e3179f007cbda62b6655447902cd51114c2a6d"} Apr 20 17:48:50.770199 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:50.770170 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" event={"ID":"a699ffd5c0f23bc0832563bc36fe8c03","Type":"ContainerStarted","Data":"a14c2a43f7683d9e174973488983d52ad567dabd5c9bc02935ad2673f6a014c3"} Apr 20 17:48:50.772091 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:50.772057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" event={"ID":"d849e597-7012-442a-96d2-7a13be37ff50","Type":"ContainerStarted","Data":"7556f6f274deea304212e8a6de59e4fbc08ac950b54d6173853efe06fc58d865"} Apr 20 17:48:50.782352 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:50.782302 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7xrzc" podStartSLOduration=3.943491745 podStartE2EDuration="21.782286248s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.83484737 +0000 UTC m=+1.927689798" lastFinishedPulling="2026-04-20 17:48:48.673641858 +0000 UTC m=+19.766484301" observedRunningTime="2026-04-20 17:48:50.781573193 +0000 UTC m=+21.874415643" watchObservedRunningTime="2026-04-20 17:48:50.782286248 +0000 UTC m=+21.875128699" Apr 20 17:48:50.798886 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:50.798837 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-82.ec2.internal" podStartSLOduration=21.798823289 podStartE2EDuration="21.798823289s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:48:50.798706951 +0000 UTC m=+21.891549402" 
watchObservedRunningTime="2026-04-20 17:48:50.798823289 +0000 UTC m=+21.891665740" Apr 20 17:48:51.468965 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:51.468827 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T17:48:50.520170749Z","UUID":"5db5d2ac-0b7c-4cea-ac49-87c60515364b","Handler":null,"Name":"","Endpoint":""} Apr 20 17:48:51.471703 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:51.471670 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 17:48:51.471703 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:51.471709 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 17:48:51.775699 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:51.775662 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" event={"ID":"d849e597-7012-442a-96d2-7a13be37ff50","Type":"ContainerStarted","Data":"345ccf17980516c5c1911b826d7c9d7e8bcb8ef8bb15273ac87c6f77d6307289"} Apr 20 17:48:51.778775 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:51.778752 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 17:48:51.779107 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:51.779045 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"8a4a1336c13d8bdf56a7d1061d16b59f75110a510b98de2d08d2190219548bc9"} Apr 20 17:48:51.802655 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:51.802609 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xlrc" podStartSLOduration=2.264875133 podStartE2EDuration="22.802595715s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.749214085 +0000 UTC m=+1.842056513" lastFinishedPulling="2026-04-20 17:48:51.286934667 +0000 UTC m=+22.379777095" observedRunningTime="2026-04-20 17:48:51.802158328 +0000 UTC m=+22.895000779" watchObservedRunningTime="2026-04-20 17:48:51.802595715 +0000 UTC m=+22.895438185" Apr 20 17:48:52.393371 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:52.393340 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:52.393996 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:52.393971 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:52.542943 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:52.542909 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:52.543115 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:52.542909 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:52.543115 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:52.543028 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:52.543220 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:52.543123 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:52.781265 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:52.781182 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:52.781662 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:52.781560 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2mz86" Apr 20 17:48:54.543727 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.543510 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:54.544345 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.543518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:54.544345 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:54.543758 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:54.544345 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:54.543822 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:54.785860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.785827 2575 generic.go:358] "Generic (PLEG): container finished" podID="f4675b7e-7738-4100-be48-af94288931f3" containerID="840bd693f9a5edb852d11d7ce8bc5b223bf30500696c12adbdd6c91450c89c4b" exitCode=0 Apr 20 17:48:54.786015 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.785907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" event={"ID":"f4675b7e-7738-4100-be48-af94288931f3","Type":"ContainerDied","Data":"840bd693f9a5edb852d11d7ce8bc5b223bf30500696c12adbdd6c91450c89c4b"} Apr 20 17:48:54.790038 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.790020 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 17:48:54.790405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.790384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"3fab5f482efe3c68ebe53d720ce35bd9a9b2485ca606b96d08ab240924b97883"} Apr 20 17:48:54.790900 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.790878 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:54.790900 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.790908 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:54.791071 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.791054 2575 scope.go:117] "RemoveContainer" containerID="e571430518369b04e6628657ef29e1cae7769c8de16986996e3380a44c3edf34" Apr 20 17:48:54.805844 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:54.805827 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:55.718317 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.718154 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-drg8v"] Apr 20 17:48:55.718745 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.718386 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:55.718745 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:55.718493 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:55.720821 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.720795 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rlsjc"] Apr 20 17:48:55.720921 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.720886 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:55.721007 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:55.720989 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:55.795036 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.795012 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 17:48:55.795392 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.795371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" event={"ID":"b17266e6-f66d-4b28-88a9-100b2da4666a","Type":"ContainerStarted","Data":"080a45c03a9760c2870f5c4c3db747926ed6e419989c1ba4759c02000cd3fdc6"} Apr 20 17:48:55.795734 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.795680 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:55.797211 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.797187 2575 generic.go:358] "Generic (PLEG): container finished" podID="f4675b7e-7738-4100-be48-af94288931f3" containerID="daf1322ccbdf2780ce4b5ae93ee25907a2fa506bb51cf24bfe8f903ef4838aef" exitCode=0 Apr 20 17:48:55.797318 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.797218 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" event={"ID":"f4675b7e-7738-4100-be48-af94288931f3","Type":"ContainerDied","Data":"daf1322ccbdf2780ce4b5ae93ee25907a2fa506bb51cf24bfe8f903ef4838aef"} Apr 20 17:48:55.811570 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.811546 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:48:55.860978 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:55.860940 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" podStartSLOduration=9.008893907 podStartE2EDuration="26.860929908s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.839169812 +0000 UTC m=+1.932012242" lastFinishedPulling="2026-04-20 17:48:48.691205811 +0000 UTC m=+19.784048243" observedRunningTime="2026-04-20 17:48:55.849171449 +0000 UTC m=+26.942013910" watchObservedRunningTime="2026-04-20 17:48:55.860929908 +0000 UTC m=+26.953772358" Apr 20 17:48:56.801002 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:56.800969 2575 generic.go:358] "Generic (PLEG): container finished" podID="f4675b7e-7738-4100-be48-af94288931f3" containerID="95cdca4919fcf93c723dd859da88f49f7f64aed7d4e8b33afa04d1dc2fc39088" exitCode=0 Apr 20 17:48:56.801357 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:56.801060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" event={"ID":"f4675b7e-7738-4100-be48-af94288931f3","Type":"ContainerDied","Data":"95cdca4919fcf93c723dd859da88f49f7f64aed7d4e8b33afa04d1dc2fc39088"} Apr 20 17:48:57.543896 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:57.543868 2575 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:57.543896 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:57.543890 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:57.544134 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:57.543996 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:48:57.544134 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:57.544086 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:59.544040 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:59.543994 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:48:59.544497 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:59.544120 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:48:59.544497 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:48:59.544164 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:48:59.544497 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:48:59.544270 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:49:01.543406 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.543184 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:49:01.543895 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.543244 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:49:01.543895 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:01.543522 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:49:01.543895 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:01.543575 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-drg8v" podUID="5d45e323-232c-48df-b245-96c179fcb1e3" Apr 20 17:49:01.695298 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.695271 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-82.ec2.internal" event="NodeReady" Apr 20 17:49:01.695455 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.695436 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 17:49:01.736938 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.736912 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6kmcw"] Apr 20 17:49:01.759966 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.759942 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kj29z"] Apr 20 17:49:01.760192 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.760171 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.762830 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.762806 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 17:49:01.762936 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.762834 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-82cw8\"" Apr 20 17:49:01.762936 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.762810 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 17:49:01.773067 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.773051 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kmcw"] Apr 20 17:49:01.773067 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.773069 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kj29z"] Apr 20 17:49:01.773194 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.773141 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:01.776075 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.775945 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 17:49:01.776306 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.776290 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 17:49:01.776703 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.776676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gkt2t\"" Apr 20 17:49:01.776931 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.776914 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 17:49:01.852333 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.852249 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m28px\" (UniqueName: \"kubernetes.io/projected/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-kube-api-access-m28px\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:01.852333 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.852290 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dc154c0-3907-4457-b37d-d6e899b39fff-config-volume\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.852529 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.852342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4dc154c0-3907-4457-b37d-d6e899b39fff-tmp-dir\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.852529 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.852405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdthh\" (UniqueName: \"kubernetes.io/projected/4dc154c0-3907-4457-b37d-d6e899b39fff-kube-api-access-kdthh\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.852529 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.852426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.852529 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.852450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:01.953672 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.953638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-m28px\" (UniqueName: \"kubernetes.io/projected/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-kube-api-access-m28px\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:01.953672 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.953677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dc154c0-3907-4457-b37d-d6e899b39fff-config-volume\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.953898 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.953720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4dc154c0-3907-4457-b37d-d6e899b39fff-tmp-dir\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.953898 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.953768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdthh\" (UniqueName: \"kubernetes.io/projected/4dc154c0-3907-4457-b37d-d6e899b39fff-kube-api-access-kdthh\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.953898 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.953858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.953898 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.953893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:01.954083 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:01.954021 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:01.954083 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:01.954031 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:01.954083 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:01.954081 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert podName:ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:02.454061741 +0000 UTC m=+33.546904171 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert") pod "ingress-canary-kj29z" (UID: "ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3") : secret "canary-serving-cert" not found Apr 20 17:49:01.954233 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:01.954100 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls podName:4dc154c0-3907-4457-b37d-d6e899b39fff nodeName:}" failed. 
No retries permitted until 2026-04-20 17:49:02.454090569 +0000 UTC m=+33.546932997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls") pod "dns-default-6kmcw" (UID: "4dc154c0-3907-4457-b37d-d6e899b39fff") : secret "dns-default-metrics-tls" not found Apr 20 17:49:01.954233 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.954117 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4dc154c0-3907-4457-b37d-d6e899b39fff-tmp-dir\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.954314 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.954274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4dc154c0-3907-4457-b37d-d6e899b39fff-config-volume\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.964549 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.964503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdthh\" (UniqueName: \"kubernetes.io/projected/4dc154c0-3907-4457-b37d-d6e899b39fff-kube-api-access-kdthh\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:01.964707 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:01.964558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m28px\" (UniqueName: \"kubernetes.io/projected/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-kube-api-access-m28px\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:02.155625 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:02.155588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:49:02.155796 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.155764 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:49:02.155863 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.155851 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:34.155835468 +0000 UTC m=+65.248677895 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 17:49:02.357278 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:02.357240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:49:02.357467 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.357388 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 17:49:02.357467 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.357408 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 17:49:02.357467 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.357421 2575 projected.go:194] Error preparing data for projected volume kube-api-access-kp4hc for pod openshift-network-diagnostics/network-check-target-drg8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:49:02.357618 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.357485 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc podName:5d45e323-232c-48df-b245-96c179fcb1e3 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:34.357466377 +0000 UTC m=+65.450308806 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kp4hc" (UniqueName: "kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc") pod "network-check-target-drg8v" (UID: "5d45e323-232c-48df-b245-96c179fcb1e3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 17:49:02.457712 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:02.457632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:02.457712 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:02.457672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:02.457866 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.457783 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:02.457866 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.457841 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls podName:4dc154c0-3907-4457-b37d-d6e899b39fff nodeName:}" failed. No retries permitted until 2026-04-20 17:49:03.457827014 +0000 UTC m=+34.550669443 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls") pod "dns-default-6kmcw" (UID: "4dc154c0-3907-4457-b37d-d6e899b39fff") : secret "dns-default-metrics-tls" not found Apr 20 17:49:02.457947 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.457789 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:02.457947 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:02.457922 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert podName:ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:03.457910814 +0000 UTC m=+34.550753241 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert") pod "ingress-canary-kj29z" (UID: "ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3") : secret "canary-serving-cert" not found Apr 20 17:49:03.465082 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.465051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:03.465082 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.465089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:03.465806 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:03.465202 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:03.465806 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:03.465267 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls podName:4dc154c0-3907-4457-b37d-d6e899b39fff nodeName:}" failed. No retries permitted until 2026-04-20 17:49:05.465251924 +0000 UTC m=+36.558094351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls") pod "dns-default-6kmcw" (UID: "4dc154c0-3907-4457-b37d-d6e899b39fff") : secret "dns-default-metrics-tls" not found Apr 20 17:49:03.465806 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:03.465265 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:03.465806 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:03.465336 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert podName:ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:05.465318121 +0000 UTC m=+36.558160757 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert") pod "ingress-canary-kj29z" (UID: "ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3") : secret "canary-serving-cert" not found Apr 20 17:49:03.543658 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.543625 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:49:03.543813 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.543626 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:49:03.546592 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.546574 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 17:49:03.547745 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.547722 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 17:49:03.547856 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.547762 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddm2v\"" Apr 20 17:49:03.547856 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.547792 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 17:49:03.547856 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.547764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-swr7p\"" Apr 20 17:49:03.815699 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.815618 2575 generic.go:358] "Generic (PLEG): container finished" podID="f4675b7e-7738-4100-be48-af94288931f3" containerID="bb687c3388a2146114d44d0749508729ca1c90e738e9369a44c830ef2e6ad4d7" exitCode=0 Apr 20 17:49:03.815812 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:03.815707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" event={"ID":"f4675b7e-7738-4100-be48-af94288931f3","Type":"ContainerDied","Data":"bb687c3388a2146114d44d0749508729ca1c90e738e9369a44c830ef2e6ad4d7"} Apr 20 17:49:04.820363 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:04.820329 2575 generic.go:358] "Generic (PLEG): container finished" podID="f4675b7e-7738-4100-be48-af94288931f3" containerID="3df816617a22dfb4c66759e5894af4855490c641e67f840ee44215ebe84bf29e" exitCode=0 Apr 20 17:49:04.820833 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:04.820374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" event={"ID":"f4675b7e-7738-4100-be48-af94288931f3","Type":"ContainerDied","Data":"3df816617a22dfb4c66759e5894af4855490c641e67f840ee44215ebe84bf29e"} Apr 20 17:49:05.478132 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:05.478095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:05.478132 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:05.478138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:05.478327 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:05.478226 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:05.478327 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:05.478232 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 20 17:49:05.478327 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:05.478278 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert podName:ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:09.478263432 +0000 UTC m=+40.571105860 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert") pod "ingress-canary-kj29z" (UID: "ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3") : secret "canary-serving-cert" not found Apr 20 17:49:05.478327 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:05.478291 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls podName:4dc154c0-3907-4457-b37d-d6e899b39fff nodeName:}" failed. No retries permitted until 2026-04-20 17:49:09.478285679 +0000 UTC m=+40.571128106 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls") pod "dns-default-6kmcw" (UID: "4dc154c0-3907-4457-b37d-d6e899b39fff") : secret "dns-default-metrics-tls" not found Apr 20 17:49:05.824959 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:05.824926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" event={"ID":"f4675b7e-7738-4100-be48-af94288931f3","Type":"ContainerStarted","Data":"39c4e7df0f1a4a57ed10a2956b9a1d7d4ab869ccea386f89ca7f239be820b81b"} Apr 20 17:49:05.848755 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:05.848712 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9jtw5" podStartSLOduration=4.910268559 podStartE2EDuration="36.84868076s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:48:30.80829255 +0000 UTC m=+1.901134991" lastFinishedPulling="2026-04-20 17:49:02.746704754 +0000 UTC m=+33.839547192" observedRunningTime="2026-04-20 17:49:05.846827213 +0000 UTC m=+36.939669664" watchObservedRunningTime="2026-04-20 17:49:05.84868076 +0000 UTC m=+36.941523210" Apr 20 17:49:09.505542 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:09.505503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:09.505973 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:09.505553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:09.505973 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:09.505638 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:09.505973 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:09.505649 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:09.505973 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:09.505712 
2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls podName:4dc154c0-3907-4457-b37d-d6e899b39fff nodeName:}" failed. No retries permitted until 2026-04-20 17:49:17.505679841 +0000 UTC m=+48.598522271 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls") pod "dns-default-6kmcw" (UID: "4dc154c0-3907-4457-b37d-d6e899b39fff") : secret "dns-default-metrics-tls" not found Apr 20 17:49:09.505973 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:09.505729 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert podName:ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:17.50572174 +0000 UTC m=+48.598564168 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert") pod "ingress-canary-kj29z" (UID: "ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3") : secret "canary-serving-cert" not found Apr 20 17:49:17.558490 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:17.558458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:17.558490 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:17.558497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:17.558892 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:17.558590 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:17.558892 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:17.558595 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:17.558892 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:17.558637 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert podName:ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3 nodeName:}" failed. No retries permitted until 2026-04-20 17:49:33.558623986 +0000 UTC m=+64.651466414 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert") pod "ingress-canary-kj29z" (UID: "ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3") : secret "canary-serving-cert" not found Apr 20 17:49:17.558892 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:17.558649 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls podName:4dc154c0-3907-4457-b37d-d6e899b39fff nodeName:}" failed. No retries permitted until 2026-04-20 17:49:33.558643602 +0000 UTC m=+64.651486030 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls") pod "dns-default-6kmcw" (UID: "4dc154c0-3907-4457-b37d-d6e899b39fff") : secret "dns-default-metrics-tls" not found Apr 20 17:49:27.813454 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:27.813429 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nfzpp" Apr 20 17:49:33.653943 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:33.653909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:49:33.654463 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:33.653951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:49:33.654463 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:33.654056 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:49:33.654463 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:33.654109 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert podName:ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:05.654096143 +0000 UTC m=+96.746938570 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert") pod "ingress-canary-kj29z" (UID: "ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3") : secret "canary-serving-cert" not found Apr 20 17:49:33.654463 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:33.654056 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 17:49:33.654463 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:33.654146 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls podName:4dc154c0-3907-4457-b37d-d6e899b39fff nodeName:}" failed. No retries permitted until 2026-04-20 17:50:05.654135325 +0000 UTC m=+96.746977754 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls") pod "dns-default-6kmcw" (UID: "4dc154c0-3907-4457-b37d-d6e899b39fff") : secret "dns-default-metrics-tls" not found Apr 20 17:49:34.158289 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.158259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:49:34.160857 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.160840 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 17:49:34.168894 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:34.168880 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 17:49:34.168948 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:49:34.168929 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:38.168914523 +0000 UTC m=+129.261756950 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : secret "metrics-daemon-secret" not found Apr 20 17:49:34.359767 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.359740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:49:34.362550 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.362529 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 17:49:34.372881 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.372862 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 17:49:34.383472 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.383454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp4hc\" (UniqueName: \"kubernetes.io/projected/5d45e323-232c-48df-b245-96c179fcb1e3-kube-api-access-kp4hc\") pod \"network-check-target-drg8v\" (UID: \"5d45e323-232c-48df-b245-96c179fcb1e3\") " pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:49:34.460016 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.459962 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-swr7p\"" Apr 20 17:49:34.467667 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.467646 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:49:34.644091 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.644056 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-drg8v"] Apr 20 17:49:34.647798 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:49:34.647769 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d45e323_232c_48df_b245_96c179fcb1e3.slice/crio-fd0ec9fd4daea8ce658f5b26f5956ec9c44f9da67261a7103ab7ec274c026d3f WatchSource:0}: Error finding container fd0ec9fd4daea8ce658f5b26f5956ec9c44f9da67261a7103ab7ec274c026d3f: Status 404 returned error can't find the container with id fd0ec9fd4daea8ce658f5b26f5956ec9c44f9da67261a7103ab7ec274c026d3f Apr 20 17:49:34.880301 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:34.880273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-drg8v" event={"ID":"5d45e323-232c-48df-b245-96c179fcb1e3","Type":"ContainerStarted","Data":"fd0ec9fd4daea8ce658f5b26f5956ec9c44f9da67261a7103ab7ec274c026d3f"} Apr 20 17:49:37.887558 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:37.887528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-drg8v" event={"ID":"5d45e323-232c-48df-b245-96c179fcb1e3","Type":"ContainerStarted","Data":"5138f527660754f60f3e0f604862176e85ec54616bedfaab4b99dae948364783"} Apr 20 17:49:37.887920 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:37.887669 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:49:37.902384 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:49:37.902340 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-drg8v" podStartSLOduration=66.049240438 podStartE2EDuration="1m8.902326732s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:49:34.650120301 +0000 UTC m=+65.742962728" lastFinishedPulling="2026-04-20 17:49:37.50320658 +0000 UTC m=+68.596049022" observedRunningTime="2026-04-20 17:49:37.902217914 +0000 UTC m=+68.995060358" watchObservedRunningTime="2026-04-20 17:49:37.902326732 +0000 UTC m=+68.995169182" Apr 20 17:50:05.664778 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:05.664744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:50:05.664778 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:05.664781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:50:05.665205 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:05.664876 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 17:50:05.665205 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:05.664882 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 20 17:50:05.665205 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:05.664932 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert podName:ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3 nodeName:}" failed. No retries permitted until 2026-04-20 17:51:09.664916704 +0000 UTC m=+160.757759132 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert") pod "ingress-canary-kj29z" (UID: "ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3") : secret "canary-serving-cert" not found Apr 20 17:50:05.665205 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:05.664945 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls podName:4dc154c0-3907-4457-b37d-d6e899b39fff nodeName:}" failed. No retries permitted until 2026-04-20 17:51:09.664939124 +0000 UTC m=+160.757781551 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls") pod "dns-default-6kmcw" (UID: "4dc154c0-3907-4457-b37d-d6e899b39fff") : secret "dns-default-metrics-tls" not found Apr 20 17:50:08.892399 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:08.892369 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-drg8v" Apr 20 17:50:38.178475 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:38.178431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:50:38.178964 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:38.178569 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 17:50:38.178964 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:38.178640 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs podName:07e151c2-7294-492d-b56b-1fc480d9ab69 nodeName:}" failed. No retries permitted until 2026-04-20 17:52:40.17862279 +0000 UTC m=+251.271465217 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs") pod "network-metrics-daemon-rlsjc" (UID: "07e151c2-7294-492d-b56b-1fc480d9ab69") : secret "metrics-daemon-secret" not found Apr 20 17:50:39.071057 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.071028 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t"] Apr 20 17:50:39.073633 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.073618 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t" Apr 20 17:50:39.076447 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.076427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 17:50:39.076553 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.076492 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-rq2l6\"" Apr 20 17:50:39.077217 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.077201 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:50:39.081270 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.081250 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7bbd7cc87d-kgrp2"] Apr 20 17:50:39.083875 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.083859 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-x9jbt"] Apr 20 17:50:39.083992 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.083977 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.086441 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.086422 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 17:50:39.086549 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.086531 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 17:50:39.086595 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.086539 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 17:50:39.086829 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.086802 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.086916 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.086825 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 17:50:39.086916 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.086846 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 17:50:39.086916 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.086874 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 17:50:39.087060 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.086812 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-6xqlr\"" Apr 20 17:50:39.091026 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.091011 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 17:50:39.092456 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.092437 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t"] Apr 20 17:50:39.093647 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.093629 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 17:50:39.094479 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.094462 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 17:50:39.094555 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.094462 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:50:39.096342 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.096322 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ph4sh\"" Apr 20 17:50:39.108366 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.108343 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bbd7cc87d-kgrp2"] Apr 20 17:50:39.109848 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.109834 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 17:50:39.127705 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.127663 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-x9jbt"] Apr 20 17:50:39.179445 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.179427 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c"] Apr 20 17:50:39.181942 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.181929 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.183754 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183735 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlfcw\" (UniqueName: \"kubernetes.io/projected/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-kube-api-access-rlfcw\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.183846 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.183846 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wjgb\" (UniqueName: \"kubernetes.io/projected/9d65389c-a51a-4254-9236-b91567c36c0d-kube-api-access-4wjgb\") pod \"volume-data-source-validator-7c6cbb6c87-9pk2t\" (UID: \"9d65389c-a51a-4254-9236-b91567c36c0d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t" Apr 20 17:50:39.183846 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-default-certificate\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.183957 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183855 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.183957 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b844265-ed78-4d7b-ae2f-e0af244b29a2-config\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.183957 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdl2r\" (UniqueName: \"kubernetes.io/projected/7b844265-ed78-4d7b-ae2f-e0af244b29a2-kube-api-access-hdl2r\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.183957 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183954 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-stats-auth\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.184095 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183969 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b844265-ed78-4d7b-ae2f-e0af244b29a2-serving-cert\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.184095 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.183983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b844265-ed78-4d7b-ae2f-e0af244b29a2-trusted-ca\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.187763 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.187737 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:50:39.187864 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.187744 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 17:50:39.188018 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.187994 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-zm9f9\"" Apr 20 17:50:39.188113 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.188016 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 17:50:39.188877 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.188862 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 17:50:39.200704 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.200648 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c"] Apr 20 17:50:39.285221 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac7cee7-8495-490d-92f9-c31987536747-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.285336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdl2r\" (UniqueName: \"kubernetes.io/projected/7b844265-ed78-4d7b-ae2f-e0af244b29a2-kube-api-access-hdl2r\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.285336 
ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-stats-auth\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.285336 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b844265-ed78-4d7b-ae2f-e0af244b29a2-serving-cert\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.285498 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b844265-ed78-4d7b-ae2f-e0af244b29a2-trusted-ca\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.285498 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlfcw\" (UniqueName: \"kubernetes.io/projected/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-kube-api-access-rlfcw\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.285498 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.285498 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wjgb\" (UniqueName: \"kubernetes.io/projected/9d65389c-a51a-4254-9236-b91567c36c0d-kube-api-access-4wjgb\") pod \"volume-data-source-validator-7c6cbb6c87-9pk2t\" (UID: \"9d65389c-a51a-4254-9236-b91567c36c0d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t" Apr 20 17:50:39.285498 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285464 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-default-certificate\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.285772 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtljk\" (UniqueName: \"kubernetes.io/projected/7ac7cee7-8495-490d-92f9-c31987536747-kube-api-access-qtljk\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.285772 
ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285534 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.285772 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac7cee7-8495-490d-92f9-c31987536747-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.285772 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.285585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b844265-ed78-4d7b-ae2f-e0af244b29a2-config\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.285772 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:39.285723 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:39.785681636 +0000 UTC m=+130.878524081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : configmap references non-existent config key: service-ca.crt Apr 20 17:50:39.286027 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:39.285913 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 17:50:39.286027 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:39.285985 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:39.785965802 +0000 UTC m=+130.878808251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : secret "router-metrics-certs-default" not found Apr 20 17:50:39.286214 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.286195 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b844265-ed78-4d7b-ae2f-e0af244b29a2-config\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.286405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.286384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7b844265-ed78-4d7b-ae2f-e0af244b29a2-trusted-ca\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.287862 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.287835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b844265-ed78-4d7b-ae2f-e0af244b29a2-serving-cert\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.287949 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.287838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-default-certificate\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.287949 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.287885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-stats-auth\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.312268 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.312236 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdl2r\" (UniqueName: \"kubernetes.io/projected/7b844265-ed78-4d7b-ae2f-e0af244b29a2-kube-api-access-hdl2r\") pod \"console-operator-9d4b6777b-x9jbt\" (UID: \"7b844265-ed78-4d7b-ae2f-e0af244b29a2\") " pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.315405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.315155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wjgb\" (UniqueName: \"kubernetes.io/projected/9d65389c-a51a-4254-9236-b91567c36c0d-kube-api-access-4wjgb\") pod \"volume-data-source-validator-7c6cbb6c87-9pk2t\" (UID: \"9d65389c-a51a-4254-9236-b91567c36c0d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t" Apr 20 17:50:39.316890 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.316530 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlfcw\" (UniqueName: \"kubernetes.io/projected/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-kube-api-access-rlfcw\") pod 
\"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.383155 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.383135 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t" Apr 20 17:50:39.385890 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.385870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtljk\" (UniqueName: \"kubernetes.io/projected/7ac7cee7-8495-490d-92f9-c31987536747-kube-api-access-qtljk\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.385945 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.385909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac7cee7-8495-490d-92f9-c31987536747-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.385945 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.385930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac7cee7-8495-490d-92f9-c31987536747-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.387034 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.387008 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac7cee7-8495-490d-92f9-c31987536747-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.388200 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.388182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ac7cee7-8495-490d-92f9-c31987536747-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.395084 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.395064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtljk\" (UniqueName: \"kubernetes.io/projected/7ac7cee7-8495-490d-92f9-c31987536747-kube-api-access-qtljk\") pod \"kube-storage-version-migrator-operator-6769c5d45-wtp2c\" (UID: \"7ac7cee7-8495-490d-92f9-c31987536747\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.399019 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.398999 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:39.493264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.493232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" Apr 20 17:50:39.513342 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.513319 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t"] Apr 20 17:50:39.515590 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:50:39.515564 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d65389c_a51a_4254_9236_b91567c36c0d.slice/crio-5d328f260d5c63199f694060c90d3d28b1e3cfe466e6ca30d29633aefebda245 WatchSource:0}: Error finding container 5d328f260d5c63199f694060c90d3d28b1e3cfe466e6ca30d29633aefebda245: Status 404 returned error can't find the container with id 5d328f260d5c63199f694060c90d3d28b1e3cfe466e6ca30d29633aefebda245 Apr 20 17:50:39.529937 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.529906 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-x9jbt"] Apr 20 17:50:39.532766 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:50:39.532742 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b844265_ed78_4d7b_ae2f_e0af244b29a2.slice/crio-d194ee9ef2687ec96f1a9970282aedbcb0e54cc07e7e9957b1b0175c62f7a1ac WatchSource:0}: Error finding container d194ee9ef2687ec96f1a9970282aedbcb0e54cc07e7e9957b1b0175c62f7a1ac: Status 404 returned error can't find the container with id d194ee9ef2687ec96f1a9970282aedbcb0e54cc07e7e9957b1b0175c62f7a1ac Apr 20 17:50:39.621976 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.621946 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c"] Apr 20 17:50:39.625625 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:50:39.625599 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ac7cee7_8495_490d_92f9_c31987536747.slice/crio-3503195993b0d45cebf8c1895edc32ad4920d66a19f828cf5f6e51f587b7a656 WatchSource:0}: Error finding container 3503195993b0d45cebf8c1895edc32ad4920d66a19f828cf5f6e51f587b7a656: Status 404 returned error can't find the container with id 3503195993b0d45cebf8c1895edc32ad4920d66a19f828cf5f6e51f587b7a656 Apr 20 17:50:39.789234 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.789159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:39.789234 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:39.789209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 
17:50:39.789400 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:39.789303 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 17:50:39.789400 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:39.789371 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:40.789350908 +0000 UTC m=+131.882193338 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : secret "router-metrics-certs-default" not found Apr 20 17:50:39.789400 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:39.789385 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:40.789379611 +0000 UTC m=+131.882222039 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : configmap references non-existent config key: service-ca.crt Apr 20 17:50:40.002666 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:40.002629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" event={"ID":"7b844265-ed78-4d7b-ae2f-e0af244b29a2","Type":"ContainerStarted","Data":"d194ee9ef2687ec96f1a9970282aedbcb0e54cc07e7e9957b1b0175c62f7a1ac"} Apr 20 17:50:40.003411 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:40.003382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" event={"ID":"7ac7cee7-8495-490d-92f9-c31987536747","Type":"ContainerStarted","Data":"3503195993b0d45cebf8c1895edc32ad4920d66a19f828cf5f6e51f587b7a656"} Apr 20 17:50:40.004254 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:40.004237 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t" event={"ID":"9d65389c-a51a-4254-9236-b91567c36c0d","Type":"ContainerStarted","Data":"5d328f260d5c63199f694060c90d3d28b1e3cfe466e6ca30d29633aefebda245"} Apr 20 17:50:40.798772 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:40.798733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:40.799180 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:40.798937 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:42.798917643 +0000 UTC m=+133.891760096 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : configmap references non-existent config key: service-ca.crt Apr 20 17:50:40.799180 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:40.799015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:40.799180 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:40.799148 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 17:50:40.799346 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:40.799196 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:42.799181578 +0000 UTC m=+133.892024012 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : secret "router-metrics-certs-default" not found Apr 20 17:50:42.013413 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:42.013372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t" event={"ID":"9d65389c-a51a-4254-9236-b91567c36c0d","Type":"ContainerStarted","Data":"cd98fd3fa8df24b235ed62fc86bd14eff81042431696ab61fbc2edf101681863"} Apr 20 17:50:42.030373 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:42.030313 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-9pk2t" podStartSLOduration=1.414234622 podStartE2EDuration="3.030296137s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="2026-04-20 17:50:39.517290042 +0000 UTC m=+130.610132470" lastFinishedPulling="2026-04-20 17:50:41.133351552 +0000 UTC m=+132.226193985" observedRunningTime="2026-04-20 17:50:42.029721258 +0000 UTC m=+133.122563708" watchObservedRunningTime="2026-04-20 17:50:42.030296137 +0000 UTC m=+133.123138584" Apr 20 17:50:42.816571 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:42.816539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:42.816756 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:42.816644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:42.816756 ip-10-0-137-82 
kubenswrapper[2575]: E0420 17:50:42.816683 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:46.816667141 +0000 UTC m=+137.909509569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : configmap references non-existent config key: service-ca.crt Apr 20 17:50:42.816893 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:42.816760 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 17:50:42.816893 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:42.816808 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:46.816793062 +0000 UTC m=+137.909635491 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : secret "router-metrics-certs-default" not found Apr 20 17:50:43.016375 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:43.016338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" event={"ID":"7ac7cee7-8495-490d-92f9-c31987536747","Type":"ContainerStarted","Data":"3dd8cd53ced89b27ea639e84f2ea57d68006145fcd84e9385a05b856c550f0ef"} Apr 20 17:50:43.017741 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:43.017721 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/0.log" Apr 20 17:50:43.017834 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:43.017755 2575 generic.go:358] "Generic (PLEG): container finished" podID="7b844265-ed78-4d7b-ae2f-e0af244b29a2" containerID="e40633692964c57fcb9b0cdaae345a6c3a397fe7e752d94ce34ee0e9de215f59" exitCode=255 Apr 20 17:50:43.017834 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:43.017827 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" event={"ID":"7b844265-ed78-4d7b-ae2f-e0af244b29a2","Type":"ContainerDied","Data":"e40633692964c57fcb9b0cdaae345a6c3a397fe7e752d94ce34ee0e9de215f59"} Apr 20 17:50:43.017983 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:43.017970 2575 scope.go:117] "RemoveContainer" containerID="e40633692964c57fcb9b0cdaae345a6c3a397fe7e752d94ce34ee0e9de215f59" Apr 20 17:50:43.034235 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:43.034187 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" podStartSLOduration=1.453262745 podStartE2EDuration="4.034171817s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="2026-04-20 17:50:39.627406271 +0000 UTC m=+130.720248702" lastFinishedPulling="2026-04-20 
17:50:42.208315346 +0000 UTC m=+133.301157774" observedRunningTime="2026-04-20 17:50:43.033338852 +0000 UTC m=+134.126181300" watchObservedRunningTime="2026-04-20 17:50:43.034171817 +0000 UTC m=+134.127014268" Apr 20 17:50:44.021522 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.021494 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/1.log" Apr 20 17:50:44.022017 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.021871 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/0.log" Apr 20 17:50:44.022017 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.021903 2575 generic.go:358] "Generic (PLEG): container finished" podID="7b844265-ed78-4d7b-ae2f-e0af244b29a2" containerID="179e971be86b9725cf8d012cb32490aabd49e0707fe867e37b076ff816ea5c58" exitCode=255 Apr 20 17:50:44.022017 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.021990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" event={"ID":"7b844265-ed78-4d7b-ae2f-e0af244b29a2","Type":"ContainerDied","Data":"179e971be86b9725cf8d012cb32490aabd49e0707fe867e37b076ff816ea5c58"} Apr 20 17:50:44.022121 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.022034 2575 scope.go:117] "RemoveContainer" containerID="e40633692964c57fcb9b0cdaae345a6c3a397fe7e752d94ce34ee0e9de215f59" Apr 20 17:50:44.022260 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.022240 2575 scope.go:117] "RemoveContainer" containerID="179e971be86b9725cf8d012cb32490aabd49e0707fe867e37b076ff816ea5c58" Apr 20 17:50:44.022480 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:44.022443 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-x9jbt_openshift-console-operator(7b844265-ed78-4d7b-ae2f-e0af244b29a2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" podUID="7b844265-ed78-4d7b-ae2f-e0af244b29a2" Apr 20 17:50:44.308735 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.308648 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg"] Apr 20 17:50:44.311845 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.311820 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" Apr 20 17:50:44.314830 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.314807 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 17:50:44.314948 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.314852 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 17:50:44.315005 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.314957 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-dxvfw\"" Apr 20 17:50:44.326060 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.326041 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg"] Apr 20 17:50:44.427803 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.427779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8h6w\" (UniqueName: \"kubernetes.io/projected/7cf1c858-dc4d-4fc1-8868-b98320883062-kube-api-access-f8h6w\") pod \"migrator-74bb7799d9-444hg\" (UID: \"7cf1c858-dc4d-4fc1-8868-b98320883062\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" Apr 20 17:50:44.528622 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.528588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8h6w\" (UniqueName: \"kubernetes.io/projected/7cf1c858-dc4d-4fc1-8868-b98320883062-kube-api-access-f8h6w\") pod \"migrator-74bb7799d9-444hg\" (UID: \"7cf1c858-dc4d-4fc1-8868-b98320883062\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" Apr 20 17:50:44.536816 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.536796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8h6w\" (UniqueName: \"kubernetes.io/projected/7cf1c858-dc4d-4fc1-8868-b98320883062-kube-api-access-f8h6w\") pod \"migrator-74bb7799d9-444hg\" (UID: \"7cf1c858-dc4d-4fc1-8868-b98320883062\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" Apr 20 17:50:44.620750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.620715 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" Apr 20 17:50:44.739721 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:44.738827 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg"] Apr 20 17:50:44.742701 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:50:44.742663 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cf1c858_dc4d_4fc1_8868_b98320883062.slice/crio-9524274b690b09296213a613e69d5cafc0e1ccf024dcc98d04560a4937140261 WatchSource:0}: Error finding container 9524274b690b09296213a613e69d5cafc0e1ccf024dcc98d04560a4937140261: Status 404 returned error can't find the container with id 9524274b690b09296213a613e69d5cafc0e1ccf024dcc98d04560a4937140261 Apr 20 17:50:45.024831 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:45.024759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/1.log" Apr 20 17:50:45.025231 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:45.025111 2575 scope.go:117] "RemoveContainer" containerID="179e971be86b9725cf8d012cb32490aabd49e0707fe867e37b076ff816ea5c58" Apr 20 17:50:45.025333 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:45.025314 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-x9jbt_openshift-console-operator(7b844265-ed78-4d7b-ae2f-e0af244b29a2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" podUID="7b844265-ed78-4d7b-ae2f-e0af244b29a2" Apr 20 17:50:45.025785 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:45.025763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" event={"ID":"7cf1c858-dc4d-4fc1-8868-b98320883062","Type":"ContainerStarted","Data":"9524274b690b09296213a613e69d5cafc0e1ccf024dcc98d04560a4937140261"} Apr 20 17:50:45.970184 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:45.970161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5gv96_70b177d7-2721-4fb0-85f6-0e4da108fdaf/dns-node-resolver/0.log" Apr 20 17:50:46.365121 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.365095 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-565k9"] Apr 20 17:50:46.367837 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.367823 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.371768 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.371746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-v2jnv\"" Apr 20 17:50:46.371876 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.371800 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 17:50:46.372784 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.372759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 17:50:46.372861 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.372780 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 17:50:46.372861 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.372837 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 17:50:46.383007 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.382986 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-565k9"] Apr 20 17:50:46.445410 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.445388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3220ef0-0f9a-49c9-a5a3-9add232540d3-signing-cabundle\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.445513 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.445423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26tlh\" (UniqueName: \"kubernetes.io/projected/a3220ef0-0f9a-49c9-a5a3-9add232540d3-kube-api-access-26tlh\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.445513 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.445467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3220ef0-0f9a-49c9-a5a3-9add232540d3-signing-key\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.545984 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.545922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3220ef0-0f9a-49c9-a5a3-9add232540d3-signing-key\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.545984 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.545974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3220ef0-0f9a-49c9-a5a3-9add232540d3-signing-cabundle\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.546112 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.546097 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26tlh\" (UniqueName: \"kubernetes.io/projected/a3220ef0-0f9a-49c9-a5a3-9add232540d3-kube-api-access-26tlh\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.546543 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.546519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a3220ef0-0f9a-49c9-a5a3-9add232540d3-signing-cabundle\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.548330 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.548310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a3220ef0-0f9a-49c9-a5a3-9add232540d3-signing-key\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.554953 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.554937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26tlh\" (UniqueName: \"kubernetes.io/projected/a3220ef0-0f9a-49c9-a5a3-9add232540d3-kube-api-access-26tlh\") pod \"service-ca-865cb79987-565k9\" (UID: \"a3220ef0-0f9a-49c9-a5a3-9add232540d3\") " pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.676656 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.676627 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-565k9" Apr 20 17:50:46.788263 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.788226 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-565k9"] Apr 20 17:50:46.791679 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:50:46.791651 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3220ef0_0f9a_49c9_a5a3_9add232540d3.slice/crio-c961b2039ab6b8a6d211c50a6c1e2293066f80c6f3b03f2b5ce10ab6f54ade7b WatchSource:0}: Error finding container c961b2039ab6b8a6d211c50a6c1e2293066f80c6f3b03f2b5ce10ab6f54ade7b: Status 404 returned error can't find the container with id c961b2039ab6b8a6d211c50a6c1e2293066f80c6f3b03f2b5ce10ab6f54ade7b Apr 20 17:50:46.848110 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.848041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:46.848244 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:46.848151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:46.848244 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:46.848213 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:54.848191642 +0000 UTC m=+145.941034069 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : configmap references non-existent config key: service-ca.crt Apr 20 17:50:46.848367 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:46.848273 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 17:50:46.848367 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:46.848330 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:50:54.848314577 +0000 UTC m=+145.941157008 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : secret "router-metrics-certs-default" not found Apr 20 17:50:47.032159 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:47.032114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" event={"ID":"7cf1c858-dc4d-4fc1-8868-b98320883062","Type":"ContainerStarted","Data":"ae12cb331fa239530c74e30cf10b3c760cea0ad304304ba43ed7d5b3911b68de"} Apr 20 17:50:47.032159 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:47.032164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" event={"ID":"7cf1c858-dc4d-4fc1-8868-b98320883062","Type":"ContainerStarted","Data":"e66cfdca61198fa1acddb2d33e5fe674cd5d64a882d0bc74634c5ca226cfa7e7"} Apr 20 17:50:47.033042 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:47.033020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-565k9" event={"ID":"a3220ef0-0f9a-49c9-a5a3-9add232540d3","Type":"ContainerStarted","Data":"c961b2039ab6b8a6d211c50a6c1e2293066f80c6f3b03f2b5ce10ab6f54ade7b"} Apr 20 17:50:47.058623 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:47.058572 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-444hg" podStartSLOduration=1.605999685 podStartE2EDuration="3.058561146s" podCreationTimestamp="2026-04-20 17:50:44 +0000 UTC" firstStartedPulling="2026-04-20 17:50:44.744958158 +0000 UTC m=+135.837800587" lastFinishedPulling="2026-04-20 17:50:46.197519618 +0000 UTC m=+137.290362048" observedRunningTime="2026-04-20 17:50:47.056097753 +0000 UTC m=+138.148940203" watchObservedRunningTime="2026-04-20 17:50:47.058561146 +0000 UTC m=+138.151403592" Apr 20 17:50:47.372452 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:47.372427 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xgfmk_799ea967-f6fc-4097-8bbf-38dcdeaa4107/node-ca/0.log" Apr 20 17:50:49.040620 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:49.040584 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-service-ca/service-ca-865cb79987-565k9" event={"ID":"a3220ef0-0f9a-49c9-a5a3-9add232540d3","Type":"ContainerStarted","Data":"0082380b49519d80a0beb498125bc07a776c0f2b246d73dbc9e9145a56145025"} Apr 20 17:50:49.058240 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:49.058189 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-565k9" podStartSLOduration=1.485116882 podStartE2EDuration="3.058175294s" podCreationTimestamp="2026-04-20 17:50:46 +0000 UTC" firstStartedPulling="2026-04-20 17:50:46.793465061 +0000 UTC m=+137.886307489" lastFinishedPulling="2026-04-20 17:50:48.366523473 +0000 UTC m=+139.459365901" observedRunningTime="2026-04-20 17:50:49.057610231 +0000 UTC m=+140.150452687" watchObservedRunningTime="2026-04-20 17:50:49.058175294 +0000 UTC m=+140.151017744" Apr 20 17:50:49.399492 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:49.399464 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:49.399492 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:49.399500 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:50:49.399856 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:49.399840 2575 scope.go:117] "RemoveContainer" containerID="179e971be86b9725cf8d012cb32490aabd49e0707fe867e37b076ff816ea5c58" Apr 20 17:50:49.400002 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:49.399987 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-x9jbt_openshift-console-operator(7b844265-ed78-4d7b-ae2f-e0af244b29a2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" podUID="7b844265-ed78-4d7b-ae2f-e0af244b29a2" Apr 20 17:50:54.912271 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:54.912243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:54.912728 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:54.912291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:50:54.912728 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:50:54.912428 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle podName:ca90dcc0-b07c-42a3-92a5-557d7ae40ea7 nodeName:}" failed. No retries permitted until 2026-04-20 17:51:10.91240797 +0000 UTC m=+162.005250400 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle") pod "router-default-7bbd7cc87d-kgrp2" (UID: "ca90dcc0-b07c-42a3-92a5-557d7ae40ea7") : configmap references non-existent config key: service-ca.crt Apr 20 17:50:54.914422 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:50:54.914397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-metrics-certs\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:51:00.543488 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:00.543454 2575 scope.go:117] "RemoveContainer" containerID="179e971be86b9725cf8d012cb32490aabd49e0707fe867e37b076ff816ea5c58" Apr 20 17:51:01.070561 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:01.070531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 17:51:01.070922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:01.070908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/1.log" Apr 20 17:51:01.070975 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:01.070939 2575 generic.go:358] "Generic (PLEG): container finished" podID="7b844265-ed78-4d7b-ae2f-e0af244b29a2" containerID="7d1a369c0d2f1f32f2a1e655befd0a915aea1dc6820eeede9e27868760a34bc3" exitCode=255 Apr 20 17:51:01.071009 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:01.070984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" event={"ID":"7b844265-ed78-4d7b-ae2f-e0af244b29a2","Type":"ContainerDied","Data":"7d1a369c0d2f1f32f2a1e655befd0a915aea1dc6820eeede9e27868760a34bc3"} Apr 20 17:51:01.071042 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:01.071011 2575 scope.go:117] "RemoveContainer" containerID="179e971be86b9725cf8d012cb32490aabd49e0707fe867e37b076ff816ea5c58" Apr 20 17:51:01.071405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:01.071388 2575 scope.go:117] "RemoveContainer" containerID="7d1a369c0d2f1f32f2a1e655befd0a915aea1dc6820eeede9e27868760a34bc3" Apr 20 17:51:01.071598 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:01.071577 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-x9jbt_openshift-console-operator(7b844265-ed78-4d7b-ae2f-e0af244b29a2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" podUID="7b844265-ed78-4d7b-ae2f-e0af244b29a2" Apr 20 17:51:02.074718 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:02.074674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 17:51:04.771365 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:04.771320 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6kmcw" 
podUID="4dc154c0-3907-4457-b37d-d6e899b39fff" Apr 20 17:51:04.782576 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:04.782551 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-kj29z" podUID="ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3" Apr 20 17:51:05.081109 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:05.081035 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kmcw" Apr 20 17:51:06.553270 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:06.553234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rlsjc" podUID="07e151c2-7294-492d-b56b-1fc480d9ab69" Apr 20 17:51:07.094894 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.094863 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-985f6489d-cxh24"] Apr 20 17:51:07.130740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.130713 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-985f6489d-cxh24"] Apr 20 17:51:07.130883 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.130777 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.135424 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.135400 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 17:51:07.135540 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.135401 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 17:51:07.136946 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.136929 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 17:51:07.137434 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.137418 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gt955\"" Apr 20 17:51:07.142966 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.142948 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 17:51:07.146366 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.146350 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4bjjc"] Apr 20 17:51:07.170154 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.170136 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4bjjc"] Apr 20 17:51:07.170258 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.170249 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.173071 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.173052 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 17:51:07.173471 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.173451 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 17:51:07.173576 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.173478 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 17:51:07.173678 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.173648 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 17:51:07.173819 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.173804 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lvbdx\"" Apr 20 17:51:07.298329 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-trusted-ca\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.298452 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-ca-trust-extracted\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.298452 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298352 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-installation-pull-secrets\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.298581 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-data-volume\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.298581 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298505 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.298581 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298532 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-bound-sa-token\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.298769 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-image-registry-private-configuration\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.298769 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-registry-certificates\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.298769 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298654 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-crio-socket\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.298769 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-registry-tls\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.298769 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.298965 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl62b\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-kube-api-access-tl62b\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.298965 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.298805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpxtq\" (UniqueName: \"kubernetes.io/projected/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-kube-api-access-lpxtq\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 
17:51:07.399372 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-data-volume\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.399504 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.399504 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-bound-sa-token\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.399504 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-image-registry-private-configuration\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.399504 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-registry-certificates\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.399504 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-crio-socket\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.399504 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-registry-tls\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.399504 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399502 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.399871 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399543 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl62b\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-kube-api-access-tl62b\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.399871 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpxtq\" (UniqueName: \"kubernetes.io/projected/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-kube-api-access-lpxtq\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.399871 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-trusted-ca\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.399871 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-ca-trust-extracted\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.399871 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-installation-pull-secrets\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.399871 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.399746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-data-volume\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.400190 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.400072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.400353 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.400296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-registry-certificates\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.400463 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.400399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-crio-socket\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.400836 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.400813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-ca-trust-extracted\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.400953 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.400936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-trusted-ca\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.402046 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.402024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-image-registry-private-configuration\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.402144 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.402109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-installation-pull-secrets\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.402369 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.402350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.402706 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.402668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-registry-tls\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.410090 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.410070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-bound-sa-token\") pod \"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.410159 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.410134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl62b\" (UniqueName: \"kubernetes.io/projected/3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4-kube-api-access-tl62b\") pod 
\"image-registry-985f6489d-cxh24\" (UID: \"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4\") " pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.410625 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.410605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpxtq\" (UniqueName: \"kubernetes.io/projected/b3c0232c-d2c7-4e1e-bf28-9713d04a4895-kube-api-access-lpxtq\") pod \"insights-runtime-extractor-4bjjc\" (UID: \"b3c0232c-d2c7-4e1e-bf28-9713d04a4895\") " pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.442477 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.442456 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:07.479244 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.479214 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4bjjc" Apr 20 17:51:07.567889 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.567863 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-985f6489d-cxh24"] Apr 20 17:51:07.571083 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:07.571059 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d171c0a_dae9_40e6_afaf_3a8d5eb4a6e4.slice/crio-2ce0b1643622a9d0fe7c3e6671886e05a92b46d758fd91567b3d134cbaf110b8 WatchSource:0}: Error finding container 2ce0b1643622a9d0fe7c3e6671886e05a92b46d758fd91567b3d134cbaf110b8: Status 404 returned error can't find the container with id 2ce0b1643622a9d0fe7c3e6671886e05a92b46d758fd91567b3d134cbaf110b8 Apr 20 17:51:07.603577 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:07.603556 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4bjjc"] Apr 20 17:51:07.610609 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:07.610585 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c0232c_d2c7_4e1e_bf28_9713d04a4895.slice/crio-269b958dddb041e8c621fbdf8e35b081fa58254d1a5cf86452aba5d21b38dcf0 WatchSource:0}: Error finding container 269b958dddb041e8c621fbdf8e35b081fa58254d1a5cf86452aba5d21b38dcf0: Status 404 returned error can't find the container with id 269b958dddb041e8c621fbdf8e35b081fa58254d1a5cf86452aba5d21b38dcf0 Apr 20 17:51:08.089864 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:08.089782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4bjjc" event={"ID":"b3c0232c-d2c7-4e1e-bf28-9713d04a4895","Type":"ContainerStarted","Data":"428b0cd9c119d9699dfc3fe79ce6d4ac6ecd951873cbf5f2f0cbce170526a603"} Apr 20 17:51:08.089864 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:08.089816 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4bjjc" event={"ID":"b3c0232c-d2c7-4e1e-bf28-9713d04a4895","Type":"ContainerStarted","Data":"269b958dddb041e8c621fbdf8e35b081fa58254d1a5cf86452aba5d21b38dcf0"} Apr 20 17:51:08.090929 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:08.090907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-985f6489d-cxh24" 
event={"ID":"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4","Type":"ContainerStarted","Data":"cb0625201c191a0a2a215f5bed9e663238a74110148becd5d7f0ac70249eb5fa"} Apr 20 17:51:08.091021 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:08.090935 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-985f6489d-cxh24" event={"ID":"3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4","Type":"ContainerStarted","Data":"2ce0b1643622a9d0fe7c3e6671886e05a92b46d758fd91567b3d134cbaf110b8"} Apr 20 17:51:08.091058 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:08.091050 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:08.109056 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:08.109020 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-985f6489d-cxh24" podStartSLOduration=1.1090085570000001 podStartE2EDuration="1.109008557s" podCreationTimestamp="2026-04-20 17:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:51:08.108544888 +0000 UTC m=+159.201387339" watchObservedRunningTime="2026-04-20 17:51:08.109008557 +0000 UTC m=+159.201851009" Apr 20 17:51:09.096284 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.096169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4bjjc" event={"ID":"b3c0232c-d2c7-4e1e-bf28-9713d04a4895","Type":"ContainerStarted","Data":"15f0e91e3949d0016d9da6ffe924327b05208f09408377f4a440d6db5809b12a"} Apr 20 17:51:09.399559 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.399531 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:51:09.399559 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.399563 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:51:09.399960 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.399943 2575 scope.go:117] "RemoveContainer" containerID="7d1a369c0d2f1f32f2a1e655befd0a915aea1dc6820eeede9e27868760a34bc3" Apr 20 17:51:09.400160 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:09.400134 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-x9jbt_openshift-console-operator(7b844265-ed78-4d7b-ae2f-e0af244b29a2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" podUID="7b844265-ed78-4d7b-ae2f-e0af244b29a2" Apr 20 17:51:09.718834 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.718740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:51:09.718834 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.718789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " 
pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:51:09.721332 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.721292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4dc154c0-3907-4457-b37d-d6e899b39fff-metrics-tls\") pod \"dns-default-6kmcw\" (UID: \"4dc154c0-3907-4457-b37d-d6e899b39fff\") " pod="openshift-dns/dns-default-6kmcw" Apr 20 17:51:09.721529 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.721510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3-cert\") pod \"ingress-canary-kj29z\" (UID: \"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3\") " pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:51:09.884116 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.884086 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-82cw8\"" Apr 20 17:51:09.892213 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:09.892197 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6kmcw" Apr 20 17:51:10.007480 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:10.007342 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6kmcw"] Apr 20 17:51:10.009678 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:10.009652 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc154c0_3907_4457_b37d_d6e899b39fff.slice/crio-07a0e0865d3f62dfe5500b5116cf7f702d82d392fdffd05a89bc6f4677bbe32e WatchSource:0}: Error finding container 07a0e0865d3f62dfe5500b5116cf7f702d82d392fdffd05a89bc6f4677bbe32e: Status 404 returned error can't find the container with id 07a0e0865d3f62dfe5500b5116cf7f702d82d392fdffd05a89bc6f4677bbe32e Apr 20 17:51:10.100332 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:10.100298 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4bjjc" event={"ID":"b3c0232c-d2c7-4e1e-bf28-9713d04a4895","Type":"ContainerStarted","Data":"b7f95490b5c06bdbe51aefbb6021779ce5e7c199bc7aad4e4fb102023a597597"} Apr 20 17:51:10.101330 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:10.101298 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kmcw" event={"ID":"4dc154c0-3907-4457-b37d-d6e899b39fff","Type":"ContainerStarted","Data":"07a0e0865d3f62dfe5500b5116cf7f702d82d392fdffd05a89bc6f4677bbe32e"} Apr 20 17:51:10.118107 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:10.118064 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4bjjc" podStartSLOduration=0.896640565 podStartE2EDuration="3.118050408s" podCreationTimestamp="2026-04-20 17:51:07 +0000 UTC" firstStartedPulling="2026-04-20 17:51:07.666796945 +0000 UTC m=+158.759639373" lastFinishedPulling="2026-04-20 17:51:09.888206788 +0000 UTC m=+160.981049216" observedRunningTime="2026-04-20 17:51:10.117372088 +0000 UTC m=+161.210214537" watchObservedRunningTime="2026-04-20 17:51:10.118050408 +0000 UTC m=+161.210892915" Apr 20 17:51:10.926993 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:10.926954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod 
\"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:51:10.928320 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:10.928294 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca90dcc0-b07c-42a3-92a5-557d7ae40ea7-service-ca-bundle\") pod \"router-default-7bbd7cc87d-kgrp2\" (UID: \"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7\") " pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:51:11.193785 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:11.193705 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:51:11.327705 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:11.327650 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7bbd7cc87d-kgrp2"] Apr 20 17:51:11.331805 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:11.331778 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca90dcc0_b07c_42a3_92a5_557d7ae40ea7.slice/crio-5eff96446b85ca8c42deaff189d46e8bc2d404da56413d8cb0109206a204a0ac WatchSource:0}: Error finding container 5eff96446b85ca8c42deaff189d46e8bc2d404da56413d8cb0109206a204a0ac: Status 404 returned error can't find the container with id 5eff96446b85ca8c42deaff189d46e8bc2d404da56413d8cb0109206a204a0ac Apr 20 17:51:12.108099 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.108064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kmcw" event={"ID":"4dc154c0-3907-4457-b37d-d6e899b39fff","Type":"ContainerStarted","Data":"b62f7fe10f190b33a899984fe9024e6a10d1d90582e79b17cdb406a0429d3b38"} Apr 20 17:51:12.108099 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.108103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6kmcw" event={"ID":"4dc154c0-3907-4457-b37d-d6e899b39fff","Type":"ContainerStarted","Data":"49d0a758d82e8f92f094dd4e0dba665074159801ebeb0ececc53c88a8b5dfb11"} Apr 20 17:51:12.108384 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.108179 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6kmcw" Apr 20 17:51:12.109381 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.109361 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" event={"ID":"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7","Type":"ContainerStarted","Data":"769063a0a7ea26935941479731196bbad0a0ae18fd9540be8a91fe60a371f2a7"} Apr 20 17:51:12.109381 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.109383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" event={"ID":"ca90dcc0-b07c-42a3-92a5-557d7ae40ea7","Type":"ContainerStarted","Data":"5eff96446b85ca8c42deaff189d46e8bc2d404da56413d8cb0109206a204a0ac"} Apr 20 17:51:12.125016 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.124976 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6kmcw" podStartSLOduration=129.881779804 podStartE2EDuration="2m11.124965518s" podCreationTimestamp="2026-04-20 17:49:01 +0000 UTC" firstStartedPulling="2026-04-20 17:51:10.011396892 +0000 UTC m=+161.104239320" lastFinishedPulling="2026-04-20 17:51:11.254582597 +0000 UTC m=+162.347425034" 
observedRunningTime="2026-04-20 17:51:12.123424073 +0000 UTC m=+163.216266535" watchObservedRunningTime="2026-04-20 17:51:12.124965518 +0000 UTC m=+163.217807968" Apr 20 17:51:12.140165 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.140129 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" podStartSLOduration=33.140117784 podStartE2EDuration="33.140117784s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:51:12.139064307 +0000 UTC m=+163.231906759" watchObservedRunningTime="2026-04-20 17:51:12.140117784 +0000 UTC m=+163.232960234" Apr 20 17:51:12.194825 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.194800 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:51:12.197123 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:12.197106 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:51:13.113746 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:13.113717 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:51:13.114958 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:13.114938 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7bbd7cc87d-kgrp2" Apr 20 17:51:17.543937 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:17.543898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:51:20.543427 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:20.543348 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:51:20.546115 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:20.546093 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gkt2t\"" Apr 20 17:51:20.554322 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:20.554303 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kj29z" Apr 20 17:51:20.665724 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:20.665676 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kj29z"] Apr 20 17:51:20.668614 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:20.668590 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff21a6d9_bc28_45e4_96ea_9f89c9bc1ce3.slice/crio-3ae5306aa879b4c3e50d8703ba7727b1010f2031d9cb5baf48cce57828a5a0d3 WatchSource:0}: Error finding container 3ae5306aa879b4c3e50d8703ba7727b1010f2031d9cb5baf48cce57828a5a0d3: Status 404 returned error can't find the container with id 3ae5306aa879b4c3e50d8703ba7727b1010f2031d9cb5baf48cce57828a5a0d3 Apr 20 17:51:21.132495 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:21.132457 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kj29z" event={"ID":"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3","Type":"ContainerStarted","Data":"3ae5306aa879b4c3e50d8703ba7727b1010f2031d9cb5baf48cce57828a5a0d3"} Apr 20 17:51:21.543574 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:21.543499 2575 scope.go:117] "RemoveContainer" containerID="7d1a369c0d2f1f32f2a1e655befd0a915aea1dc6820eeede9e27868760a34bc3" Apr 20 17:51:22.116653 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.116622 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6kmcw" Apr 20 17:51:22.139279 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.138403 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 17:51:22.139279 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.138470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" event={"ID":"7b844265-ed78-4d7b-ae2f-e0af244b29a2","Type":"ContainerStarted","Data":"d5103bfaacf7c6223c4d321f79ecca18b4e014b643a7ef99fa7c4344c53094b0"} Apr 20 17:51:22.139279 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.139231 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:51:22.179003 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.178791 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" podStartSLOduration=40.507414495 podStartE2EDuration="43.178772569s" podCreationTimestamp="2026-04-20 17:50:39 +0000 UTC" firstStartedPulling="2026-04-20 17:50:39.534367038 +0000 UTC m=+130.627209466" lastFinishedPulling="2026-04-20 17:50:42.205725108 +0000 UTC m=+133.298567540" observedRunningTime="2026-04-20 17:51:22.17763854 +0000 UTC m=+173.270480991" watchObservedRunningTime="2026-04-20 17:51:22.178772569 +0000 UTC m=+173.271615020" Apr 20 17:51:22.233135 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.232934 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-x9jbt" Apr 20 17:51:22.238083 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.237500 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kvg82"] Apr 20 17:51:22.242295 ip-10-0-137-82 kubenswrapper[2575]: I0420 
17:51:22.242263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.244678 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.244654 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 17:51:22.246284 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.246254 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 17:51:22.246469 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.246453 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 17:51:22.246573 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.246526 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 17:51:22.246681 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.246661 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-r82kb\"" Apr 20 17:51:22.246826 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.246808 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 17:51:22.246935 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.246863 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 17:51:22.266197 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.266162 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kvg82"] Apr 20 17:51:22.269130 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.269105 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qmh2c"] Apr 20 17:51:22.272820 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.272799 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.275677 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.275653 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 17:51:22.276018 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.275990 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 17:51:22.276288 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.276267 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-75n4m\"" Apr 20 17:51:22.276372 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.276311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 17:51:22.314228 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.314228 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-sys\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.314450 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.314450 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.314450 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-metrics-client-ca\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.314450 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4h9b\" (UniqueName: \"kubernetes.io/projected/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-kube-api-access-s4h9b\") pod \"node-exporter-qmh2c\" (UID: 
\"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.314450 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.314450 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-accelerators-collector-config\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.314450 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds7td\" (UniqueName: \"kubernetes.io/projected/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-api-access-ds7td\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.314750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.314750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-wtmp\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.314750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.314750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-root\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.314750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314653 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.314750 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.314678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-textfile\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416040 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-root\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416178 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.416178 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-textfile\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416178 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.416178 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-root\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416178 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-sys\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416356 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416356 
ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.416356 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-metrics-client-ca\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416356 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4h9b\" (UniqueName: \"kubernetes.io/projected/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-kube-api-access-s4h9b\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416356 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416519 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-accelerators-collector-config\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416519 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416393 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds7td\" (UniqueName: \"kubernetes.io/projected/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-api-access-ds7td\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.416519 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.416519 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-wtmp\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.416519 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:51:22.416488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.416696 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.416519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.418208 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.417159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-metrics-client-ca\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.418208 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.417291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-wtmp\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.418208 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.417314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.418208 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.417520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-textfile\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.418208 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.417569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-sys\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.418208 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:22.417637 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 17:51:22.418208 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:22.417710 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls podName:ad44e0ba-0906-4c2e-a55b-edba31d4f5df nodeName:}" failed. No retries permitted until 2026-04-20 17:51:22.91767169 +0000 UTC m=+174.010514133 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls") pod "node-exporter-qmh2c" (UID: "ad44e0ba-0906-4c2e-a55b-edba31d4f5df") : secret "node-exporter-tls" not found Apr 20 17:51:22.418208 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.418148 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.418673 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.418577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-accelerators-collector-config\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.419349 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.419299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.420042 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.420014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.420164 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.420146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.438364 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.438343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds7td\" (UniqueName: \"kubernetes.io/projected/33f6c414-d3f6-4ff7-b22e-e998f3e790bf-kube-api-access-ds7td\") pod \"kube-state-metrics-69db897b98-kvg82\" (UID: \"33f6c414-d3f6-4ff7-b22e-e998f3e790bf\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.438787 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.438739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4h9b\" (UniqueName: \"kubernetes.io/projected/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-kube-api-access-s4h9b\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.554030 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.553998 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" Apr 20 17:51:22.692037 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.692004 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-kvg82"] Apr 20 17:51:22.695396 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:22.695363 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f6c414_d3f6_4ff7_b22e_e998f3e790bf.slice/crio-7ef4c3d3ea6af7bbf55e493e38a74c1b0b9548ab26f1d30fab3eb4672c10cef2 WatchSource:0}: Error finding container 7ef4c3d3ea6af7bbf55e493e38a74c1b0b9548ab26f1d30fab3eb4672c10cef2: Status 404 returned error can't find the container with id 7ef4c3d3ea6af7bbf55e493e38a74c1b0b9548ab26f1d30fab3eb4672c10cef2 Apr 20 17:51:22.920855 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:22.920772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:22.921024 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:22.920943 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 17:51:22.921090 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:22.921025 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls podName:ad44e0ba-0906-4c2e-a55b-edba31d4f5df nodeName:}" failed. No retries permitted until 2026-04-20 17:51:23.921004076 +0000 UTC m=+175.013846509 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls") pod "node-exporter-qmh2c" (UID: "ad44e0ba-0906-4c2e-a55b-edba31d4f5df") : secret "node-exporter-tls" not found Apr 20 17:51:23.143045 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:23.143007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kj29z" event={"ID":"ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3","Type":"ContainerStarted","Data":"48067649da673bb690a0adba46ea15481b73959f0c645e7e3767672f856c890b"} Apr 20 17:51:23.144149 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:23.144119 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" event={"ID":"33f6c414-d3f6-4ff7-b22e-e998f3e790bf","Type":"ContainerStarted","Data":"7ef4c3d3ea6af7bbf55e493e38a74c1b0b9548ab26f1d30fab3eb4672c10cef2"} Apr 20 17:51:23.160801 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:23.160761 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-kj29z" podStartSLOduration=140.459073682 podStartE2EDuration="2m22.160739319s" podCreationTimestamp="2026-04-20 17:49:01 +0000 UTC" firstStartedPulling="2026-04-20 17:51:20.670481465 +0000 UTC m=+171.763323893" lastFinishedPulling="2026-04-20 17:51:22.372147085 +0000 UTC m=+173.464989530" observedRunningTime="2026-04-20 17:51:23.159871377 +0000 UTC m=+174.252713829" watchObservedRunningTime="2026-04-20 17:51:23.160739319 +0000 UTC m=+174.253581771" Apr 20 17:51:23.931317 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:23.931281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:23.933874 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:23.933847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ad44e0ba-0906-4c2e-a55b-edba31d4f5df-node-exporter-tls\") pod \"node-exporter-qmh2c\" (UID: \"ad44e0ba-0906-4c2e-a55b-edba31d4f5df\") " pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:24.086878 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:24.086857 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-qmh2c" Apr 20 17:51:24.148357 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:24.148324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" event={"ID":"33f6c414-d3f6-4ff7-b22e-e998f3e790bf","Type":"ContainerStarted","Data":"ce5da26bcfa725e30001466e4c5c3db3a497fbf4c428f2f5bbdd19c42e3d939f"} Apr 20 17:51:24.150769 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:24.150629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qmh2c" event={"ID":"ad44e0ba-0906-4c2e-a55b-edba31d4f5df","Type":"ContainerStarted","Data":"27a2d927b4df98a497b69247d17c526fd95e1b08f949caeae47d7b378af53eb3"} Apr 20 17:51:25.155065 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.155030 2575 generic.go:358] "Generic (PLEG): container finished" podID="ad44e0ba-0906-4c2e-a55b-edba31d4f5df" containerID="daebb0bb7c4c90616b40b11867081b35ebb6040f6b051e5149f44ff23b2aad95" exitCode=0 Apr 20 17:51:25.155495 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.155117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qmh2c" event={"ID":"ad44e0ba-0906-4c2e-a55b-edba31d4f5df","Type":"ContainerDied","Data":"daebb0bb7c4c90616b40b11867081b35ebb6040f6b051e5149f44ff23b2aad95"} Apr 20 17:51:25.157120 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.157099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" event={"ID":"33f6c414-d3f6-4ff7-b22e-e998f3e790bf","Type":"ContainerStarted","Data":"34e23ed254a71db09ddc1e431c50a7367493e6be2bb0deac1ee46f580a0a110b"} Apr 20 17:51:25.157120 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.157123 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" event={"ID":"33f6c414-d3f6-4ff7-b22e-e998f3e790bf","Type":"ContainerStarted","Data":"15f2df0e33feac15c87fc1ab429129e04256bbba16640233413be165f3022679"} Apr 20 17:51:25.200206 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.200183 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7fc96f9886-8bm76"] Apr 20 17:51:25.203281 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.203232 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-kvg82" podStartSLOduration=1.88213985 podStartE2EDuration="3.203216937s" podCreationTimestamp="2026-04-20 17:51:22 +0000 UTC" firstStartedPulling="2026-04-20 17:51:22.697476513 +0000 UTC m=+173.790318954" lastFinishedPulling="2026-04-20 17:51:24.018553613 +0000 UTC m=+175.111396041" observedRunningTime="2026-04-20 17:51:25.202462491 +0000 UTC m=+176.295304944" watchObservedRunningTime="2026-04-20 17:51:25.203216937 +0000 UTC m=+176.296059387" Apr 20 17:51:25.205232 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.205214 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.208049 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.208019 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 17:51:25.208049 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.208038 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 17:51:25.208208 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.208181 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 17:51:25.208542 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.208267 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-c1gh1o96a9hg5\"" Apr 20 17:51:25.208542 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.208294 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pmgmn\"" Apr 20 17:51:25.208542 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.208346 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 17:51:25.208542 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.208365 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 17:51:25.215787 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.215702 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fc96f9886-8bm76"] Apr 20 17:51:25.240858 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.240828 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.240953 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.240874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-grpc-tls\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.240953 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.240926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-tls\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.241083 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.240969 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.241083 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.241039 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.241198 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.241080 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.241198 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.241176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5659e814-9376-4138-a459-a38bc362b01b-metrics-client-ca\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.241291 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.241206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9bb2\" (UniqueName: \"kubernetes.io/projected/5659e814-9376-4138-a459-a38bc362b01b-kube-api-access-t9bb2\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.341803 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.341769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.341803 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.341806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-grpc-tls\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.342007 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.341843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-tls\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.342007 ip-10-0-137-82 kubenswrapper[2575]: I0420 
17:51:25.341869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.342007 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.341917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.342007 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.341936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.342007 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.341973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5659e814-9376-4138-a459-a38bc362b01b-metrics-client-ca\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.342261 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.342002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9bb2\" (UniqueName: \"kubernetes.io/projected/5659e814-9376-4138-a459-a38bc362b01b-kube-api-access-t9bb2\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.343377 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.343327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5659e814-9376-4138-a459-a38bc362b01b-metrics-client-ca\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.344465 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.344441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.344603 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.344586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " 
pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.344831 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.344812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.344916 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.344850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-grpc-tls\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.344916 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.344865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-tls\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.345177 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.345159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5659e814-9376-4138-a459-a38bc362b01b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.355790 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.355766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9bb2\" (UniqueName: \"kubernetes.io/projected/5659e814-9376-4138-a459-a38bc362b01b-kube-api-access-t9bb2\") pod \"thanos-querier-7fc96f9886-8bm76\" (UID: \"5659e814-9376-4138-a459-a38bc362b01b\") " pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.514773 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.514709 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:25.633579 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:25.633557 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fc96f9886-8bm76"] Apr 20 17:51:25.637371 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:25.637344 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5659e814_9376_4138_a459_a38bc362b01b.slice/crio-4fc22bb83182f621fb59c710d85481dc04148f050605901f0177f6c2fc7d36d8 WatchSource:0}: Error finding container 4fc22bb83182f621fb59c710d85481dc04148f050605901f0177f6c2fc7d36d8: Status 404 returned error can't find the container with id 4fc22bb83182f621fb59c710d85481dc04148f050605901f0177f6c2fc7d36d8 Apr 20 17:51:26.161066 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.161033 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" event={"ID":"5659e814-9376-4138-a459-a38bc362b01b","Type":"ContainerStarted","Data":"4fc22bb83182f621fb59c710d85481dc04148f050605901f0177f6c2fc7d36d8"} Apr 20 17:51:26.162934 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.162907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qmh2c" event={"ID":"ad44e0ba-0906-4c2e-a55b-edba31d4f5df","Type":"ContainerStarted","Data":"7436f0d2dff2afe7d769291be9722bde37fea3e2ea48c83ed8c6c8ea9e1df635"} Apr 20 17:51:26.163059 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.162939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qmh2c" event={"ID":"ad44e0ba-0906-4c2e-a55b-edba31d4f5df","Type":"ContainerStarted","Data":"8ef10ffb91a5b366de734d40716431c172917612cd9eba5c76c2469f0925d3f6"} Apr 20 17:51:26.186679 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.186635 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qmh2c" podStartSLOduration=3.551364042 podStartE2EDuration="4.186621387s" podCreationTimestamp="2026-04-20 17:51:22 +0000 UTC" firstStartedPulling="2026-04-20 17:51:24.108983253 +0000 UTC m=+175.201825683" lastFinishedPulling="2026-04-20 17:51:24.744240593 +0000 UTC m=+175.837083028" observedRunningTime="2026-04-20 17:51:26.185761014 +0000 UTC m=+177.278603465" watchObservedRunningTime="2026-04-20 17:51:26.186621387 +0000 UTC m=+177.279463836" Apr 20 17:51:26.954906 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.954872 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm"] Apr 20 17:51:26.958067 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.958049 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" Apr 20 17:51:26.960648 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.960628 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 17:51:26.960901 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.960886 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-qz9v2\"" Apr 20 17:51:26.967865 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:26.967840 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm"] Apr 20 17:51:27.056143 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:27.056110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22557b24-c8ee-4a18-949c-0102a598d5a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2brbm\" (UID: \"22557b24-c8ee-4a18-949c-0102a598d5a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" Apr 20 17:51:27.157748 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:27.157706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22557b24-c8ee-4a18-949c-0102a598d5a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2brbm\" (UID: \"22557b24-c8ee-4a18-949c-0102a598d5a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" Apr 20 17:51:27.157995 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:27.157871 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 20 17:51:27.157995 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:51:27.157946 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22557b24-c8ee-4a18-949c-0102a598d5a9-monitoring-plugin-cert podName:22557b24-c8ee-4a18-949c-0102a598d5a9 nodeName:}" failed. No retries permitted until 2026-04-20 17:51:27.657925062 +0000 UTC m=+178.750767507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/22557b24-c8ee-4a18-949c-0102a598d5a9-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-2brbm" (UID: "22557b24-c8ee-4a18-949c-0102a598d5a9") : secret "monitoring-plugin-cert" not found Apr 20 17:51:27.665670 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:27.665631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22557b24-c8ee-4a18-949c-0102a598d5a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2brbm\" (UID: \"22557b24-c8ee-4a18-949c-0102a598d5a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" Apr 20 17:51:27.667972 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:27.667941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/22557b24-c8ee-4a18-949c-0102a598d5a9-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-2brbm\" (UID: \"22557b24-c8ee-4a18-949c-0102a598d5a9\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" Apr 20 17:51:27.867590 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:27.867565 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" Apr 20 17:51:27.988233 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:27.988190 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm"] Apr 20 17:51:27.990834 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:27.990810 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22557b24_c8ee_4a18_949c_0102a598d5a9.slice/crio-a18a25607076f76b11f2011f911445481b7d36275e9872d864bd32428749a78a WatchSource:0}: Error finding container a18a25607076f76b11f2011f911445481b7d36275e9872d864bd32428749a78a: Status 404 returned error can't find the container with id a18a25607076f76b11f2011f911445481b7d36275e9872d864bd32428749a78a Apr 20 17:51:28.171240 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.171142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" event={"ID":"5659e814-9376-4138-a459-a38bc362b01b","Type":"ContainerStarted","Data":"3ed7191830ea56ccc8e9c1bbdc6938c6733994e081020a46d0d7c37ba459a400"} Apr 20 17:51:28.171240 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.171192 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" event={"ID":"5659e814-9376-4138-a459-a38bc362b01b","Type":"ContainerStarted","Data":"0e8ff1495e9b1aaf024217028b85223fa3c3d43aafc2fae2d3c74d1896f79c0c"} Apr 20 17:51:28.171240 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.171206 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" event={"ID":"5659e814-9376-4138-a459-a38bc362b01b","Type":"ContainerStarted","Data":"3c592a3fb307215738811d45e303f336f744427fc9b85e2694841484fed3cd5e"} Apr 20 17:51:28.172140 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.172113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" event={"ID":"22557b24-c8ee-4a18-949c-0102a598d5a9","Type":"ContainerStarted","Data":"a18a25607076f76b11f2011f911445481b7d36275e9872d864bd32428749a78a"} Apr 20 17:51:28.457484 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.457394 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:51:28.463120 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.463075 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.465964 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.465834 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 17:51:28.465964 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.465853 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 17:51:28.465964 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.465836 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 17:51:28.466292 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.466266 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-krmvm\"" Apr 20 17:51:28.466410 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.466325 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 17:51:28.466410 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.466338 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bqbbd1gb1l32p\"" Apr 20 17:51:28.466603 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.466583 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 17:51:28.466716 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.466638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 17:51:28.466716 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.466647 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 17:51:28.466941 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.466921 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 17:51:28.467224 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.467204 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 17:51:28.467346 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.467324 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 17:51:28.467446 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.467389 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 17:51:28.467508 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.467487 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 17:51:28.469829 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.469808 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 17:51:28.484635 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.484603 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:51:28.572088 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:51:28.572050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572088 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572286 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572286 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572286 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572286 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572286 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572286 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572541 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572303 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572541 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572344 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572541 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572541 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572541 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572541 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96fkv\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-kube-api-access-96fkv\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572541 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572827 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572827 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.572827 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.572635 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.673516 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673481 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.673516 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96fkv\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-kube-api-access-96fkv\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.673995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.674017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.674293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.674052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.675907 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.675577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.676469 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.676127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.678328 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.678019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.678943 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.678889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.679458 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.679434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.679794 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.679772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.679889 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:51:28.679867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.680314 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.680292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.680942 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.680900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.683854 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.682929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.683854 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.683103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.683854 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.683406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.683854 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.683613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.683854 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.683745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.684150 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.684002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.684150 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.684127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.684373 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.684352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.686409 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.686388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96fkv\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-kube-api-access-96fkv\") pod \"prometheus-k8s-0\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.775630 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.775600 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:28.932984 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:28.932941 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:51:28.938721 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:51:28.938676 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e3f09a_7b0b_44a7_bf48_7781ae1175f2.slice/crio-977c04f9d07f14cf7f7e7881e06f1f7fc91b7cc238c33b2300c83885e12e403f WatchSource:0}: Error finding container 977c04f9d07f14cf7f7e7881e06f1f7fc91b7cc238c33b2300c83885e12e403f: Status 404 returned error can't find the container with id 977c04f9d07f14cf7f7e7881e06f1f7fc91b7cc238c33b2300c83885e12e403f Apr 20 17:51:29.101515 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:29.101426 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-985f6489d-cxh24" Apr 20 17:51:29.176656 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:29.176613 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerStarted","Data":"977c04f9d07f14cf7f7e7881e06f1f7fc91b7cc238c33b2300c83885e12e403f"} Apr 20 17:51:29.179551 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:29.179525 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" event={"ID":"5659e814-9376-4138-a459-a38bc362b01b","Type":"ContainerStarted","Data":"ccafe716e12e569d53368f816121fbca7fda435ce711ffad0495ce012ba0f13e"} Apr 20 17:51:29.179717 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:29.179557 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" event={"ID":"5659e814-9376-4138-a459-a38bc362b01b","Type":"ContainerStarted","Data":"46320be804614a5c59a8bb439b58aff9c9083b19852940a95fa2e61266244491"} Apr 20 17:51:29.179717 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:29.179573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" event={"ID":"5659e814-9376-4138-a459-a38bc362b01b","Type":"ContainerStarted","Data":"fe626c615569d21cd1f64896cee096bfad2c21b2dc3e52cd5dbb62685429e0ca"} Apr 20 17:51:29.179811 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:29.179729 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:29.210862 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:29.210633 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" podStartSLOduration=1.108552565 podStartE2EDuration="4.210616146s" podCreationTimestamp="2026-04-20 17:51:25 +0000 UTC" firstStartedPulling="2026-04-20 17:51:25.639255595 +0000 UTC m=+176.732098023" lastFinishedPulling="2026-04-20 17:51:28.741319164 +0000 UTC m=+179.834161604" observedRunningTime="2026-04-20 17:51:29.209380571 +0000 UTC m=+180.302223021" watchObservedRunningTime="2026-04-20 17:51:29.210616146 +0000 UTC m=+180.303458597" Apr 20 17:51:30.184043 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:30.184016 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" event={"ID":"22557b24-c8ee-4a18-949c-0102a598d5a9","Type":"ContainerStarted","Data":"a904ddebf3936db3acc29dd5f0b3ac3862471d768918474b23e1a42b13e2b73d"} Apr 20 17:51:30.184381 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:30.184163 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" Apr 20 17:51:30.189090 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:30.189071 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" Apr 20 17:51:30.199398 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:30.199359 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-2brbm" podStartSLOduration=2.61506281 podStartE2EDuration="4.199348368s" podCreationTimestamp="2026-04-20 17:51:26 +0000 UTC" firstStartedPulling="2026-04-20 17:51:27.992630587 +0000 UTC m=+179.085473015" lastFinishedPulling="2026-04-20 17:51:29.576916142 +0000 UTC m=+180.669758573" observedRunningTime="2026-04-20 17:51:30.197982479 +0000 UTC m=+181.290824929" watchObservedRunningTime="2026-04-20 17:51:30.199348368 +0000 UTC m=+181.292190876" Apr 20 17:51:31.188922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:31.188885 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" exitCode=0 Apr 20 17:51:31.189306 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:31.188972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerDied","Data":"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a"} Apr 20 17:51:34.207547 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:34.207504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerStarted","Data":"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf"} Apr 20 17:51:34.208009 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:34.207558 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerStarted","Data":"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0"} Apr 20 17:51:35.190703 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:35.190662 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7fc96f9886-8bm76" Apr 20 17:51:35.217574 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:35.217539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerStarted","Data":"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1"} Apr 20 17:51:35.217574 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:35.217581 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerStarted","Data":"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9"} Apr 20 17:51:35.218012 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:35.217594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerStarted","Data":"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e"} Apr 20 17:51:35.218012 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:35.217619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerStarted","Data":"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635"} Apr 20 17:51:35.251813 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:35.251753 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.176383907 podStartE2EDuration="7.251740486s" podCreationTimestamp="2026-04-20 17:51:28 +0000 UTC" firstStartedPulling="2026-04-20 17:51:28.940733118 +0000 UTC m=+180.033575550" lastFinishedPulling="2026-04-20 17:51:34.016089698 +0000 UTC m=+185.108932129" observedRunningTime="2026-04-20 17:51:35.249991809 +0000 UTC m=+186.342834262" watchObservedRunningTime="2026-04-20 17:51:35.251740486 +0000 UTC m=+186.344582935" Apr 20 17:51:38.776257 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:38.776228 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:51:49.255658 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:49.255625 2575 generic.go:358] "Generic (PLEG): container finished" podID="7ac7cee7-8495-490d-92f9-c31987536747" containerID="3dd8cd53ced89b27ea639e84f2ea57d68006145fcd84e9385a05b856c550f0ef" exitCode=0 Apr 20 17:51:49.256043 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:49.255717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" event={"ID":"7ac7cee7-8495-490d-92f9-c31987536747","Type":"ContainerDied","Data":"3dd8cd53ced89b27ea639e84f2ea57d68006145fcd84e9385a05b856c550f0ef"} Apr 20 17:51:49.256043 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:51:49.256019 2575 scope.go:117] "RemoveContainer" containerID="3dd8cd53ced89b27ea639e84f2ea57d68006145fcd84e9385a05b856c550f0ef" Apr 20 17:51:50.260457 ip-10-0-137-82 kubenswrapper[2575]: 
I0420 17:51:50.260425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wtp2c" event={"ID":"7ac7cee7-8495-490d-92f9-c31987536747","Type":"ContainerStarted","Data":"416e421375f398c41174ba97a0871a4bd3295d8b04998087677dc8e934f7c75f"} Apr 20 17:52:28.775815 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:28.775771 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:28.791160 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:28.791138 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:29.383445 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:29.383420 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:40.273789 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:40.273744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:52:40.275998 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:40.275970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07e151c2-7294-492d-b56b-1fc480d9ab69-metrics-certs\") pod \"network-metrics-daemon-rlsjc\" (UID: \"07e151c2-7294-492d-b56b-1fc480d9ab69\") " pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:52:40.347301 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:40.347276 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddm2v\"" Apr 20 17:52:40.355366 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:40.355349 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rlsjc" Apr 20 17:52:40.473070 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:40.473049 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rlsjc"] Apr 20 17:52:40.475821 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:52:40.475793 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07e151c2_7294_492d_b56b_1fc480d9ab69.slice/crio-b8b5a0f0490ab7793797a8923b2875bb4cec27ca5ac68ae8991e85400b8649fe WatchSource:0}: Error finding container b8b5a0f0490ab7793797a8923b2875bb4cec27ca5ac68ae8991e85400b8649fe: Status 404 returned error can't find the container with id b8b5a0f0490ab7793797a8923b2875bb4cec27ca5ac68ae8991e85400b8649fe Apr 20 17:52:41.401775 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:41.401728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rlsjc" event={"ID":"07e151c2-7294-492d-b56b-1fc480d9ab69","Type":"ContainerStarted","Data":"b8b5a0f0490ab7793797a8923b2875bb4cec27ca5ac68ae8991e85400b8649fe"} Apr 20 17:52:42.405966 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:42.405928 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rlsjc" event={"ID":"07e151c2-7294-492d-b56b-1fc480d9ab69","Type":"ContainerStarted","Data":"c8dcc0c4d23f7a421057e06bd5a3cb8d82332aa7ddca70b6b2be82738d22aa37"} Apr 20 17:52:42.405966 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:42.405965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rlsjc" event={"ID":"07e151c2-7294-492d-b56b-1fc480d9ab69","Type":"ContainerStarted","Data":"0a598c2032d222659d5a6f23cefee484663a5208708e1f387847e62309a0fa9c"} Apr 20 17:52:42.423752 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:42.423708 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rlsjc" podStartSLOduration=252.439966238 podStartE2EDuration="4m13.423673859s" podCreationTimestamp="2026-04-20 17:48:29 +0000 UTC" firstStartedPulling="2026-04-20 17:52:40.477451484 +0000 UTC m=+251.570293912" lastFinishedPulling="2026-04-20 17:52:41.461159091 +0000 UTC m=+252.554001533" observedRunningTime="2026-04-20 17:52:42.422043688 +0000 UTC m=+253.514886137" watchObservedRunningTime="2026-04-20 17:52:42.423673859 +0000 UTC m=+253.516516309" Apr 20 17:52:46.860321 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:46.860287 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:52:46.860934 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:46.860900 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="prometheus" containerID="cri-o://f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" gracePeriod=600 Apr 20 17:52:46.861034 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:46.860927 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="config-reloader" containerID="cri-o://aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" gracePeriod=600 Apr 20 17:52:46.861034 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:46.860928 2575 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy-web" containerID="cri-o://e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" gracePeriod=600 Apr 20 17:52:46.861143 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:46.861008 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy" containerID="cri-o://282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" gracePeriod=600 Apr 20 17:52:46.861143 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:46.860927 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="thanos-sidecar" containerID="cri-o://c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" gracePeriod=600 Apr 20 17:52:46.861243 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:46.861125 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy-thanos" containerID="cri-o://cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" gracePeriod=600 Apr 20 17:52:47.091325 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.091302 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.229348 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229320 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.229348 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229354 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-metrics-client-certs\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.229596 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229385 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-tls-assets\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.229596 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229410 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96fkv\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-kube-api-access-96fkv\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.229596 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229443 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-serving-certs-ca-bundle\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 
17:52:47.229596 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229471 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-thanos-prometheus-http-client-file\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.229596 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229504 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-kubelet-serving-ca-bundle\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.229942 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229912 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:52:47.230004 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229938 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:52:47.230004 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.229570 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-tls\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.230106 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.230013 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-metrics-client-ca\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.230106 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.230069 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-db\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.230514 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.230457 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:52:47.230625 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.230533 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-rulefiles-0\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.231547 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231523 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-kube-rbac-proxy\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.231649 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231559 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config-out\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.231649 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231583 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.231649 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231594 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:52:47.231649 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231611 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.231649 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231640 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-grpc-tls\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.231880 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231706 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-web-config\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.231880 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231742 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-trusted-ca-bundle\") pod \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\" (UID: \"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2\") " Apr 20 17:52:47.231880 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.231861 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config" (OuterVolumeSpecName: "config") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.232087 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232069 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-db\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.232140 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232093 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.232140 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232107 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.232140 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232122 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.232140 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232136 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-configmap-metrics-client-ca\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.232334 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232225 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:52:47.232334 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232289 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.232441 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232357 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-kube-api-access-96fkv" (OuterVolumeSpecName: "kube-api-access-96fkv") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "kube-api-access-96fkv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:52:47.232507 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232479 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.232566 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232533 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.232622 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.232598 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:52:47.233760 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.233732 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.233848 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.233825 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 17:52:47.234174 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.234152 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.234592 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.234560 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config-out" (OuterVolumeSpecName: "config-out") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:52:47.234654 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.234637 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.234929 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.234915 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.244638 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.244609 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-web-config" (OuterVolumeSpecName: "web-config") pod "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" (UID: "e0e3f09a-7b0b-44a7-bf48-7781ae1175f2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 17:52:47.332959 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.332932 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.332959 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.332957 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-kube-rbac-proxy\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.332968 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-config-out\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.332979 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.332988 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.332998 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-grpc-tls\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" 
Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.333008 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-web-config\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.333017 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.333026 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-metrics-client-certs\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.333034 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-tls-assets\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.333043 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96fkv\" (UniqueName: \"kubernetes.io/projected/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-kube-api-access-96fkv\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.333052 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.333105 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.333060 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:52:47.424752 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424719 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" exitCode=0 Apr 20 17:52:47.424752 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424745 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" exitCode=0 Apr 20 17:52:47.424752 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424754 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" exitCode=0 Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424765 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" exitCode=0 Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424773 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" 
containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" exitCode=0 Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424781 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" exitCode=0 Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424808 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerDied","Data":"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9"} Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424848 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerDied","Data":"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e"} Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424877 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerDied","Data":"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635"} Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerDied","Data":"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1"} Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerDied","Data":"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf"} Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerDied","Data":"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0"} Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424916 2575 scope.go:117] "RemoveContainer" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" Apr 20 17:52:47.424922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.424923 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e0e3f09a-7b0b-44a7-bf48-7781ae1175f2","Type":"ContainerDied","Data":"977c04f9d07f14cf7f7e7881e06f1f7fc91b7cc238c33b2300c83885e12e403f"} Apr 20 17:52:47.435194 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.435175 2575 scope.go:117] "RemoveContainer" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" Apr 20 17:52:47.443108 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.443092 2575 scope.go:117] "RemoveContainer" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" Apr 20 17:52:47.450296 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.450276 2575 
scope.go:117] "RemoveContainer" containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" Apr 20 17:52:47.451135 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.451113 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:52:47.457242 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.457223 2575 scope.go:117] "RemoveContainer" containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" Apr 20 17:52:47.457297 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.457228 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:52:47.463360 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.463343 2575 scope.go:117] "RemoveContainer" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" Apr 20 17:52:47.469730 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.469714 2575 scope.go:117] "RemoveContainer" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" Apr 20 17:52:47.475582 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.475565 2575 scope.go:117] "RemoveContainer" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" Apr 20 17:52:47.475843 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:52:47.475821 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": container with ID starting with cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9 not found: ID does not exist" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" Apr 20 17:52:47.475909 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.475850 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9"} err="failed to get container status \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": rpc error: code = NotFound desc = could not find container \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": container with ID starting with cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9 not found: ID does not exist" Apr 20 17:52:47.475909 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.475881 2575 scope.go:117] "RemoveContainer" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" Apr 20 17:52:47.476109 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:52:47.476092 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": container with ID starting with 282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e not found: ID does not exist" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" Apr 20 17:52:47.476149 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.476115 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e"} err="failed to get container status \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": rpc error: code = NotFound desc = could not find container \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": container with ID starting with 
282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e not found: ID does not exist" Apr 20 17:52:47.476149 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.476131 2575 scope.go:117] "RemoveContainer" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" Apr 20 17:52:47.476357 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:52:47.476337 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": container with ID starting with e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635 not found: ID does not exist" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" Apr 20 17:52:47.476406 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.476362 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635"} err="failed to get container status \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": rpc error: code = NotFound desc = could not find container \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": container with ID starting with e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635 not found: ID does not exist" Apr 20 17:52:47.476406 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.476376 2575 scope.go:117] "RemoveContainer" containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" Apr 20 17:52:47.476552 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:52:47.476539 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": container with ID starting with c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1 not found: ID does not exist" containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" Apr 20 17:52:47.476594 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.476559 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1"} err="failed to get container status \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": rpc error: code = NotFound desc = could not find container \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": container with ID starting with c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1 not found: ID does not exist" Apr 20 17:52:47.476594 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.476572 2575 scope.go:117] "RemoveContainer" containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" Apr 20 17:52:47.476859 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:52:47.476844 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": container with ID starting with aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf not found: ID does not exist" containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" Apr 20 17:52:47.476922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.476862 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf"} err="failed to get container status \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": rpc error: code = NotFound desc = could not find container \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": container with ID starting with aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf not found: ID does not exist" Apr 20 17:52:47.476922 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.476875 2575 scope.go:117] "RemoveContainer" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" Apr 20 17:52:47.477090 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:52:47.477073 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": container with ID starting with f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0 not found: ID does not exist" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" Apr 20 17:52:47.477128 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.477099 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0"} err="failed to get container status \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": rpc error: code = NotFound desc = could not find container \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": container with ID starting with f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0 not found: ID does not exist" Apr 20 17:52:47.477128 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.477119 2575 scope.go:117] "RemoveContainer" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" Apr 20 17:52:47.477349 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:52:47.477334 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": container with ID starting with 10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a not found: ID does not exist" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" Apr 20 17:52:47.477396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.477353 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a"} err="failed to get container status \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": rpc error: code = NotFound desc = could not find container \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": container with ID starting with 10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a not found: ID does not exist" Apr 20 17:52:47.477396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.477366 2575 scope.go:117] "RemoveContainer" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" Apr 20 17:52:47.477567 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.477551 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9"} err="failed to get container status 
\"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": rpc error: code = NotFound desc = could not find container \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": container with ID starting with cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9 not found: ID does not exist" Apr 20 17:52:47.477616 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.477570 2575 scope.go:117] "RemoveContainer" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" Apr 20 17:52:47.477791 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.477771 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e"} err="failed to get container status \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": rpc error: code = NotFound desc = could not find container \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": container with ID starting with 282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e not found: ID does not exist" Apr 20 17:52:47.477791 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.477789 2575 scope.go:117] "RemoveContainer" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" Apr 20 17:52:47.478031 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478012 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635"} err="failed to get container status \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": rpc error: code = NotFound desc = could not find container \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": container with ID starting with e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635 not found: ID does not exist" Apr 20 17:52:47.478101 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478032 2575 scope.go:117] "RemoveContainer" containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" Apr 20 17:52:47.478223 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478207 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1"} err="failed to get container status \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": rpc error: code = NotFound desc = could not find container \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": container with ID starting with c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1 not found: ID does not exist" Apr 20 17:52:47.478290 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478226 2575 scope.go:117] "RemoveContainer" containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" Apr 20 17:52:47.478418 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478402 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf"} err="failed to get container status \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": rpc error: code = NotFound desc = could not find container \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": container with ID starting with aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf not found: ID does not 
exist" Apr 20 17:52:47.478480 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478419 2575 scope.go:117] "RemoveContainer" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" Apr 20 17:52:47.478616 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478589 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0"} err="failed to get container status \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": rpc error: code = NotFound desc = could not find container \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": container with ID starting with f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0 not found: ID does not exist" Apr 20 17:52:47.478616 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478610 2575 scope.go:117] "RemoveContainer" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" Apr 20 17:52:47.478812 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478787 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a"} err="failed to get container status \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": rpc error: code = NotFound desc = could not find container \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": container with ID starting with 10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a not found: ID does not exist" Apr 20 17:52:47.478893 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478812 2575 scope.go:117] "RemoveContainer" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" Apr 20 17:52:47.478983 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478967 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9"} err="failed to get container status \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": rpc error: code = NotFound desc = could not find container \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": container with ID starting with cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9 not found: ID does not exist" Apr 20 17:52:47.479033 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.478984 2575 scope.go:117] "RemoveContainer" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" Apr 20 17:52:47.479160 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479146 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e"} err="failed to get container status \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": rpc error: code = NotFound desc = could not find container \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": container with ID starting with 282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e not found: ID does not exist" Apr 20 17:52:47.479211 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479160 2575 scope.go:117] "RemoveContainer" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" Apr 20 17:52:47.479324 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479305 2575 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635"} err="failed to get container status \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": rpc error: code = NotFound desc = could not find container \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": container with ID starting with e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635 not found: ID does not exist" Apr 20 17:52:47.479403 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479324 2575 scope.go:117] "RemoveContainer" containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" Apr 20 17:52:47.479530 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479511 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1"} err="failed to get container status \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": rpc error: code = NotFound desc = could not find container \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": container with ID starting with c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1 not found: ID does not exist" Apr 20 17:52:47.479575 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479531 2575 scope.go:117] "RemoveContainer" containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" Apr 20 17:52:47.479771 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479748 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf"} err="failed to get container status \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": rpc error: code = NotFound desc = could not find container \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": container with ID starting with aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf not found: ID does not exist" Apr 20 17:52:47.479855 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479775 2575 scope.go:117] "RemoveContainer" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" Apr 20 17:52:47.480001 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479979 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0"} err="failed to get container status \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": rpc error: code = NotFound desc = could not find container \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": container with ID starting with f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0 not found: ID does not exist" Apr 20 17:52:47.480001 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.479998 2575 scope.go:117] "RemoveContainer" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" Apr 20 17:52:47.480226 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.480206 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a"} err="failed to get container status \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": rpc error: code = NotFound desc = could not find container 
\"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": container with ID starting with 10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a not found: ID does not exist" Apr 20 17:52:47.480226 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.480228 2575 scope.go:117] "RemoveContainer" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" Apr 20 17:52:47.480494 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.480476 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9"} err="failed to get container status \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": rpc error: code = NotFound desc = could not find container \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": container with ID starting with cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9 not found: ID does not exist" Apr 20 17:52:47.480546 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.480495 2575 scope.go:117] "RemoveContainer" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" Apr 20 17:52:47.480744 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.480722 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e"} err="failed to get container status \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": rpc error: code = NotFound desc = could not find container \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": container with ID starting with 282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e not found: ID does not exist" Apr 20 17:52:47.480813 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.480744 2575 scope.go:117] "RemoveContainer" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" Apr 20 17:52:47.480969 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.480951 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635"} err="failed to get container status \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": rpc error: code = NotFound desc = could not find container \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": container with ID starting with e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635 not found: ID does not exist" Apr 20 17:52:47.481011 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.480969 2575 scope.go:117] "RemoveContainer" containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" Apr 20 17:52:47.481196 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.481180 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1"} err="failed to get container status \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": rpc error: code = NotFound desc = could not find container \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": container with ID starting with c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1 not found: ID does not exist" Apr 20 17:52:47.481248 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.481196 2575 scope.go:117] "RemoveContainer" 
containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" Apr 20 17:52:47.481381 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.481366 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf"} err="failed to get container status \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": rpc error: code = NotFound desc = could not find container \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": container with ID starting with aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf not found: ID does not exist" Apr 20 17:52:47.481436 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.481389 2575 scope.go:117] "RemoveContainer" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" Apr 20 17:52:47.481601 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.481585 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0"} err="failed to get container status \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": rpc error: code = NotFound desc = could not find container \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": container with ID starting with f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0 not found: ID does not exist" Apr 20 17:52:47.481668 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.481603 2575 scope.go:117] "RemoveContainer" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" Apr 20 17:52:47.481854 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.481835 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a"} err="failed to get container status \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": rpc error: code = NotFound desc = could not find container \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": container with ID starting with 10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a not found: ID does not exist" Apr 20 17:52:47.481905 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.481855 2575 scope.go:117] "RemoveContainer" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" Apr 20 17:52:47.482080 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482062 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9"} err="failed to get container status \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": rpc error: code = NotFound desc = could not find container \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": container with ID starting with cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9 not found: ID does not exist" Apr 20 17:52:47.482145 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482081 2575 scope.go:117] "RemoveContainer" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" Apr 20 17:52:47.482306 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482288 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e"} err="failed to get container status \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": rpc error: code = NotFound desc = could not find container \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": container with ID starting with 282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e not found: ID does not exist" Apr 20 17:52:47.482372 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482307 2575 scope.go:117] "RemoveContainer" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" Apr 20 17:52:47.482546 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482530 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635"} err="failed to get container status \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": rpc error: code = NotFound desc = could not find container \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": container with ID starting with e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635 not found: ID does not exist" Apr 20 17:52:47.482616 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482547 2575 scope.go:117] "RemoveContainer" containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" Apr 20 17:52:47.482773 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482755 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1"} err="failed to get container status \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": rpc error: code = NotFound desc = could not find container \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": container with ID starting with c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1 not found: ID does not exist" Apr 20 17:52:47.482840 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482775 2575 scope.go:117] "RemoveContainer" containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" Apr 20 17:52:47.482997 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482977 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf"} err="failed to get container status \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": rpc error: code = NotFound desc = could not find container \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": container with ID starting with aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf not found: ID does not exist" Apr 20 17:52:47.483053 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.482999 2575 scope.go:117] "RemoveContainer" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" Apr 20 17:52:47.483214 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.483188 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0"} err="failed to get container status \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": rpc error: code = NotFound desc = could not find container 
\"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": container with ID starting with f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0 not found: ID does not exist" Apr 20 17:52:47.483214 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.483212 2575 scope.go:117] "RemoveContainer" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" Apr 20 17:52:47.483452 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.483432 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a"} err="failed to get container status \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": rpc error: code = NotFound desc = could not find container \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": container with ID starting with 10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a not found: ID does not exist" Apr 20 17:52:47.483452 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.483452 2575 scope.go:117] "RemoveContainer" containerID="cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9" Apr 20 17:52:47.483671 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.483652 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9"} err="failed to get container status \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": rpc error: code = NotFound desc = could not find container \"cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9\": container with ID starting with cac1c427e0e029e09496b3674228ebf68d85586671ebc1728003e6048f532bc9 not found: ID does not exist" Apr 20 17:52:47.483671 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.483670 2575 scope.go:117] "RemoveContainer" containerID="282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e" Apr 20 17:52:47.483930 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.483913 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e"} err="failed to get container status \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": rpc error: code = NotFound desc = could not find container \"282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e\": container with ID starting with 282a1901c5a7d020df9f9aff4a3cee161373c2a599aed120d13bd71d186d3e2e not found: ID does not exist" Apr 20 17:52:47.483930 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.483930 2575 scope.go:117] "RemoveContainer" containerID="e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635" Apr 20 17:52:47.484134 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484116 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635"} err="failed to get container status \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": rpc error: code = NotFound desc = could not find container \"e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635\": container with ID starting with e0b69931de8abbc03222bf1f2e00ed432dcf62f97d0375c8feb30a3529640635 not found: ID does not exist" Apr 20 17:52:47.484207 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484135 2575 scope.go:117] "RemoveContainer" 
containerID="c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1" Apr 20 17:52:47.484347 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484332 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1"} err="failed to get container status \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": rpc error: code = NotFound desc = could not find container \"c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1\": container with ID starting with c377547cab5d040825b0f5407cd7769210f482a14c3783cacb25dc3b41f5f5b1 not found: ID does not exist" Apr 20 17:52:47.484347 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484346 2575 scope.go:117] "RemoveContainer" containerID="aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf" Apr 20 17:52:47.484509 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484490 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf"} err="failed to get container status \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": rpc error: code = NotFound desc = could not find container \"aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf\": container with ID starting with aaeb82b321b7d2e2304f723f274f7361322ae9c81761d4fab1c8dfcfa6b49bcf not found: ID does not exist" Apr 20 17:52:47.484568 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484510 2575 scope.go:117] "RemoveContainer" containerID="f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0" Apr 20 17:52:47.484724 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484707 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0"} err="failed to get container status \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": rpc error: code = NotFound desc = could not find container \"f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0\": container with ID starting with f36a100de7447791ad01334b6bf27e912673e43b3b52f810ef21042b89d921e0 not found: ID does not exist" Apr 20 17:52:47.484724 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484723 2575 scope.go:117] "RemoveContainer" containerID="10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a" Apr 20 17:52:47.484915 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.484897 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a"} err="failed to get container status \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": rpc error: code = NotFound desc = could not find container \"10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a\": container with ID starting with 10cc985faf6f5eff2081de771481730dba0d2e9a095d4f489c0f15f585cf392a not found: ID does not exist" Apr 20 17:52:47.495953 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.495928 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:52:47.496400 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496375 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy-thanos" Apr 20 
17:52:47.496474 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496407 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy-thanos" Apr 20 17:52:47.496474 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496428 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="init-config-reloader" Apr 20 17:52:47.496474 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496440 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="init-config-reloader" Apr 20 17:52:47.496474 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496462 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy-web" Apr 20 17:52:47.496474 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496473 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy-web" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496491 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496503 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496523 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="prometheus" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496534 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="prometheus" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496546 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="thanos-sidecar" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496557 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="thanos-sidecar" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496571 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="config-reloader" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496582 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="config-reloader" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496662 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy-web" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496678 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="prometheus" Apr 20 17:52:47.496740 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496739 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" 
containerName="kube-rbac-proxy-thanos" Apr 20 17:52:47.497154 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496756 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="kube-rbac-proxy" Apr 20 17:52:47.497154 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496768 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="config-reloader" Apr 20 17:52:47.497154 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.496782 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" containerName="thanos-sidecar" Apr 20 17:52:47.499389 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.499370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.504082 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504062 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 17:52:47.504183 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504091 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 17:52:47.504254 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504240 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 17:52:47.504505 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504458 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 17:52:47.504505 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504466 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 17:52:47.504505 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504500 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 17:52:47.504736 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504517 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-krmvm\"" Apr 20 17:52:47.504736 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504459 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 17:52:47.505087 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504871 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 17:52:47.505087 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.504873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 17:52:47.505087 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.505047 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 17:52:47.505790 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.505763 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bqbbd1gb1l32p\"" Apr 20 17:52:47.505879 ip-10-0-137-82 
kubenswrapper[2575]: I0420 17:52:47.505770 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 17:52:47.509209 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.509191 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 17:52:47.511235 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.511220 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 17:52:47.517440 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.517418 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:52:47.547706 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.547661 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e3f09a-7b0b-44a7-bf48-7781ae1175f2" path="/var/lib/kubelet/pods/e0e3f09a-7b0b-44a7-bf48-7781ae1175f2/volumes" Apr 20 17:52:47.635487 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.635487 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.635814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635513 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-web-config\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.635814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.635814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635611 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.635814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmqvv\" (UniqueName: \"kubernetes.io/projected/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-kube-api-access-fmqvv\") pod 
\"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.635814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.635814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.635814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-config-out\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636109 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635823 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636109 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-config\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636109 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636109 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.635976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636109 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.636026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636109 
ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.636068 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636319 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.636112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636319 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.636170 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.636319 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.636219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.736793 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.736793 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmqvv\" (UniqueName: \"kubernetes.io/projected/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-kube-api-access-fmqvv\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.736793 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736824 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-config-out\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-config\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736908 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.736968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.737010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.737034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.737081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
17:52:47.737405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.737113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.737151 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.737181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.737207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-web-config\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.737405 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.737243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.738176 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.738150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.739291 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.739259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-config-out\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740054 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.739557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740054 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.739710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740054 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.739873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740054 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.739918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740308 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.740061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740308 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.740075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-config\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740413 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.740380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740653 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.740629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.740828 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.740806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.741194 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.741168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.741396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.741373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.741890 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.741871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.742308 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.742291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.742993 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.742973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-web-config\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.743514 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.743497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.745282 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.745266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmqvv\" (UniqueName: \"kubernetes.io/projected/3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9-kube-api-access-fmqvv\") pod \"prometheus-k8s-0\" (UID: \"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.810359 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.810337 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:52:47.945585 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:47.945529 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 17:52:47.947827 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:52:47.947786 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f2bc841_2a6c_4bd1_9ca7_445ea635dbe9.slice/crio-c883624639e9b0b25b3b9545d0e0b9534938b3fe2046f881c15cc32fc088675d WatchSource:0}: Error finding container c883624639e9b0b25b3b9545d0e0b9534938b3fe2046f881c15cc32fc088675d: Status 404 returned error can't find the container with id c883624639e9b0b25b3b9545d0e0b9534938b3fe2046f881c15cc32fc088675d Apr 20 17:52:48.430010 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:48.429974 2575 generic.go:358] "Generic (PLEG): container finished" podID="3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9" containerID="16628eeb338ea91f0b658bf21e879fc5bb4c8bed91ed4b18ed084d5140922a7a" exitCode=0 Apr 20 17:52:48.430148 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:48.430017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9","Type":"ContainerDied","Data":"16628eeb338ea91f0b658bf21e879fc5bb4c8bed91ed4b18ed084d5140922a7a"} Apr 20 17:52:48.430148 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:48.430039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9","Type":"ContainerStarted","Data":"c883624639e9b0b25b3b9545d0e0b9534938b3fe2046f881c15cc32fc088675d"} Apr 20 17:52:49.435096 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:49.435065 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9","Type":"ContainerStarted","Data":"15346af7eca4a253b8347610e31225007f23b3daff43b1c67546615683b0da4f"} Apr 20 17:52:49.435096 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:49.435097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9","Type":"ContainerStarted","Data":"1fcd6927180f6af934d311205a62a1ebb188b309d9c805167c693f1557a7be7a"} Apr 20 17:52:49.435458 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:49.435106 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9","Type":"ContainerStarted","Data":"5e68e474f098eba56f503530193eaaf08c85f3049ae4d8c315ffbf90ecbe36db"} Apr 20 17:52:49.435458 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:49.435116 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9","Type":"ContainerStarted","Data":"43f89b013a66dcbc28496ed66646ffe1bc2eaba1c9c591e2e293847675ba53ff"} Apr 20 17:52:49.435458 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:49.435124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9","Type":"ContainerStarted","Data":"4cfd1c4cbe269f386a24c29ae9344d887ded8d5019c8f8bac05eb665a70ae758"} Apr 20 17:52:49.435458 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:49.435133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9","Type":"ContainerStarted","Data":"33532e644e1bac79da7985136d98f4c2bf088d9552a67682511c9a7ff95a58fe"} Apr 20 17:52:49.466964 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:49.466917 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.466901047 podStartE2EDuration="2.466901047s" podCreationTimestamp="2026-04-20 17:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:52:49.465559589 +0000 UTC m=+260.558402049" watchObservedRunningTime="2026-04-20 17:52:49.466901047 +0000 UTC m=+260.559743498" Apr 20 17:52:52.810585 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:52:52.810553 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:53:29.411444 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:53:29.411414 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 17:53:29.412000 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:53:29.411631 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 17:53:29.414295 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:53:29.414269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 17:53:29.414467 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:53:29.414348 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 17:53:29.421386 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:53:29.421368 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 17:53:47.811440 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:53:47.811403 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:53:47.826549 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:53:47.826529 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:53:48.610400 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:53:48.610374 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 17:54:45.080816 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.080780 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pn6v5"] Apr 20 17:54:45.084027 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.084011 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.086495 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.086473 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 17:54:45.093926 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.093905 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pn6v5"] Apr 20 17:54:45.121801 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.121776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/95da1003-73aa-4420-b974-8a7d21bee404-kubelet-config\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.121904 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.121804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/95da1003-73aa-4420-b974-8a7d21bee404-original-pull-secret\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.121904 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.121850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/95da1003-73aa-4420-b974-8a7d21bee404-dbus\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.222421 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.222396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/95da1003-73aa-4420-b974-8a7d21bee404-dbus\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.222529 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.222450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/95da1003-73aa-4420-b974-8a7d21bee404-kubelet-config\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.222529 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.222477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/95da1003-73aa-4420-b974-8a7d21bee404-original-pull-secret\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.222602 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.222578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/95da1003-73aa-4420-b974-8a7d21bee404-dbus\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.222636 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.222580 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" 
(UniqueName: \"kubernetes.io/host-path/95da1003-73aa-4420-b974-8a7d21bee404-kubelet-config\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.224655 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.224632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/95da1003-73aa-4420-b974-8a7d21bee404-original-pull-secret\") pod \"global-pull-secret-syncer-pn6v5\" (UID: \"95da1003-73aa-4420-b974-8a7d21bee404\") " pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.393743 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.393713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pn6v5" Apr 20 17:54:45.508286 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.508137 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pn6v5"] Apr 20 17:54:45.510859 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:54:45.510832 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95da1003_73aa_4420_b974_8a7d21bee404.slice/crio-9ba1e4333cd1ac986c3f958f3a6cb01545c3bcd7a44460e815c68a3046483938 WatchSource:0}: Error finding container 9ba1e4333cd1ac986c3f958f3a6cb01545c3bcd7a44460e815c68a3046483938: Status 404 returned error can't find the container with id 9ba1e4333cd1ac986c3f958f3a6cb01545c3bcd7a44460e815c68a3046483938 Apr 20 17:54:45.512524 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.512508 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 17:54:45.751310 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:45.751235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pn6v5" event={"ID":"95da1003-73aa-4420-b974-8a7d21bee404","Type":"ContainerStarted","Data":"9ba1e4333cd1ac986c3f958f3a6cb01545c3bcd7a44460e815c68a3046483938"} Apr 20 17:54:50.767948 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:50.767912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pn6v5" event={"ID":"95da1003-73aa-4420-b974-8a7d21bee404","Type":"ContainerStarted","Data":"33e30fc7104e1d07fb78ede8273f2adadde7387c61dfd99387d1d93e3672cce5"} Apr 20 17:54:50.783295 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:54:50.783250 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pn6v5" podStartSLOduration=1.476048378 podStartE2EDuration="5.783238579s" podCreationTimestamp="2026-04-20 17:54:45 +0000 UTC" firstStartedPulling="2026-04-20 17:54:45.512625115 +0000 UTC m=+376.605467543" lastFinishedPulling="2026-04-20 17:54:49.819815314 +0000 UTC m=+380.912657744" observedRunningTime="2026-04-20 17:54:50.781970483 +0000 UTC m=+381.874812932" watchObservedRunningTime="2026-04-20 17:54:50.783238579 +0000 UTC m=+381.876081029" Apr 20 17:56:01.816360 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.816284 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp"] Apr 20 17:56:01.819526 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.819510 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:01.822182 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.822160 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-sc5qn\"" Apr 20 17:56:01.822290 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.822224 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 17:56:01.822441 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.822427 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 17:56:01.822771 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.822754 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 17:56:01.822771 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.822762 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 17:56:01.839854 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.839832 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp"] Apr 20 17:56:01.929987 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.929952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfcp7\" (UniqueName: \"kubernetes.io/projected/cc572493-2a3e-4b9f-a55e-486e62f87313-kube-api-access-zfcp7\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:01.930143 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.929997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc572493-2a3e-4b9f-a55e-486e62f87313-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:01.930143 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:01.930069 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc572493-2a3e-4b9f-a55e-486e62f87313-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:02.030579 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.030542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfcp7\" (UniqueName: \"kubernetes.io/projected/cc572493-2a3e-4b9f-a55e-486e62f87313-kube-api-access-zfcp7\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:02.030771 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.030591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/cc572493-2a3e-4b9f-a55e-486e62f87313-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:02.030771 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.030628 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc572493-2a3e-4b9f-a55e-486e62f87313-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:02.033127 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.033104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc572493-2a3e-4b9f-a55e-486e62f87313-webhook-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:02.033287 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.033265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc572493-2a3e-4b9f-a55e-486e62f87313-apiservice-cert\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:02.043587 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.043564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfcp7\" (UniqueName: \"kubernetes.io/projected/cc572493-2a3e-4b9f-a55e-486e62f87313-kube-api-access-zfcp7\") pod \"opendatahub-operator-controller-manager-b8c4c7886-t7jxp\" (UID: \"cc572493-2a3e-4b9f-a55e-486e62f87313\") " pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:02.130179 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.130150 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:02.284935 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.284909 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp"] Apr 20 17:56:02.286672 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:56:02.286640 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc572493_2a3e_4b9f_a55e_486e62f87313.slice/crio-3a48c04fb3da482d346ad1e083344342c085d706bc433c63a9fd1a6d79af42e8 WatchSource:0}: Error finding container 3a48c04fb3da482d346ad1e083344342c085d706bc433c63a9fd1a6d79af42e8: Status 404 returned error can't find the container with id 3a48c04fb3da482d346ad1e083344342c085d706bc433c63a9fd1a6d79af42e8 Apr 20 17:56:02.976027 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:02.975974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" event={"ID":"cc572493-2a3e-4b9f-a55e-486e62f87313","Type":"ContainerStarted","Data":"3a48c04fb3da482d346ad1e083344342c085d706bc433c63a9fd1a6d79af42e8"} Apr 20 17:56:04.983595 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:04.983559 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" event={"ID":"cc572493-2a3e-4b9f-a55e-486e62f87313","Type":"ContainerStarted","Data":"aea6b88c59624655b7e4e05f152152cee6d3dd00c82aa050343ed5f228a61f49"} Apr 20 17:56:04.984084 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:04.983715 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:05.004041 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:05.003998 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" podStartSLOduration=1.453989121 podStartE2EDuration="4.003986419s" podCreationTimestamp="2026-04-20 17:56:01 +0000 UTC" firstStartedPulling="2026-04-20 17:56:02.288302836 +0000 UTC m=+453.381145264" lastFinishedPulling="2026-04-20 17:56:04.838300132 +0000 UTC m=+455.931142562" observedRunningTime="2026-04-20 17:56:05.003233746 +0000 UTC m=+456.096076196" watchObservedRunningTime="2026-04-20 17:56:05.003986419 +0000 UTC m=+456.096828870" Apr 20 17:56:06.709639 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.709600 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2"] Apr 20 17:56:06.713411 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.713389 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.715878 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.715852 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 17:56:06.716904 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.716880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-sgp5t\"" Apr 20 17:56:06.716997 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.716934 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 17:56:06.716997 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.716969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 17:56:06.717099 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.716944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 17:56:06.717241 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.717227 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 17:56:06.725900 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.725880 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2"] Apr 20 17:56:06.777056 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.777030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/92158631-16a6-4817-9a0c-0e7b403207e7-manager-config\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.777197 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.777060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw7kg\" (UniqueName: \"kubernetes.io/projected/92158631-16a6-4817-9a0c-0e7b403207e7-kube-api-access-sw7kg\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.777197 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.777174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92158631-16a6-4817-9a0c-0e7b403207e7-cert\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.777307 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.777217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/92158631-16a6-4817-9a0c-0e7b403207e7-metrics-cert\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.878199 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.878164 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/92158631-16a6-4817-9a0c-0e7b403207e7-manager-config\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.878199 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.878201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sw7kg\" (UniqueName: \"kubernetes.io/projected/92158631-16a6-4817-9a0c-0e7b403207e7-kube-api-access-sw7kg\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.878449 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.878249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92158631-16a6-4817-9a0c-0e7b403207e7-cert\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.878449 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.878270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/92158631-16a6-4817-9a0c-0e7b403207e7-metrics-cert\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.878909 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.878885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/92158631-16a6-4817-9a0c-0e7b403207e7-manager-config\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.880658 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.880640 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/92158631-16a6-4817-9a0c-0e7b403207e7-metrics-cert\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.880775 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.880759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92158631-16a6-4817-9a0c-0e7b403207e7-cert\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:06.890997 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:06.890971 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw7kg\" (UniqueName: \"kubernetes.io/projected/92158631-16a6-4817-9a0c-0e7b403207e7-kube-api-access-sw7kg\") pod \"lws-controller-manager-7589d7b74d-w54d2\" (UID: \"92158631-16a6-4817-9a0c-0e7b403207e7\") " pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:07.023449 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:07.023353 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:07.146710 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:07.146662 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2"] Apr 20 17:56:07.150758 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:56:07.150730 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92158631_16a6_4817_9a0c_0e7b403207e7.slice/crio-f8ec6b2eba7eba5d1461a298fc977d21b39a514d92d1ff238cb661925ae5e58a WatchSource:0}: Error finding container f8ec6b2eba7eba5d1461a298fc977d21b39a514d92d1ff238cb661925ae5e58a: Status 404 returned error can't find the container with id f8ec6b2eba7eba5d1461a298fc977d21b39a514d92d1ff238cb661925ae5e58a Apr 20 17:56:07.994452 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:07.994417 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" event={"ID":"92158631-16a6-4817-9a0c-0e7b403207e7","Type":"ContainerStarted","Data":"f8ec6b2eba7eba5d1461a298fc977d21b39a514d92d1ff238cb661925ae5e58a"} Apr 20 17:56:15.018024 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:15.017989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" event={"ID":"92158631-16a6-4817-9a0c-0e7b403207e7","Type":"ContainerStarted","Data":"4fdbc1d7b816fcb5eb293926ee954bccc9bba6292d9015318168fa97931052bb"} Apr 20 17:56:15.018395 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:15.018098 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:15.035502 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:15.035453 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" podStartSLOduration=1.905786554 podStartE2EDuration="9.035440426s" podCreationTimestamp="2026-04-20 17:56:06 +0000 UTC" firstStartedPulling="2026-04-20 17:56:07.152759817 +0000 UTC m=+458.245602250" lastFinishedPulling="2026-04-20 17:56:14.282413679 +0000 UTC m=+465.375256122" observedRunningTime="2026-04-20 17:56:15.034255821 +0000 UTC m=+466.127098272" watchObservedRunningTime="2026-04-20 17:56:15.035440426 +0000 UTC m=+466.128282895" Apr 20 17:56:15.990025 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:15.989992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-b8c4c7886-t7jxp" Apr 20 17:56:26.023865 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:26.023827 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7589d7b74d-w54d2" Apr 20 17:56:48.345465 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.345428 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm"] Apr 20 17:56:48.357362 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.357339 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.360107 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.360076 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 17:56:48.360264 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.360181 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-cmrpn\"" Apr 20 17:56:48.360942 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.360914 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm"] Apr 20 17:56:48.419251 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.419397 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419267 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.419397 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.419397 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.419397 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.419397 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/7503f84f-1a4e-4263-8f8a-dfd88c876194-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.419613 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.419613 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.419613 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.419473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5cns\" (UniqueName: \"kubernetes.io/projected/7503f84f-1a4e-4263-8f8a-dfd88c876194-kube-api-access-v5cns\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.520749 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.520707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.520919 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.520782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.520919 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.520815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.520919 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.520845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-credential-socket\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.520919 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.520875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.520919 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.520902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7503f84f-1a4e-4263-8f8a-dfd88c876194-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.521187 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.520953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.521187 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.520990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.521187 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.521019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5cns\" (UniqueName: \"kubernetes.io/projected/7503f84f-1a4e-4263-8f8a-dfd88c876194-kube-api-access-v5cns\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.521352 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.521233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.521352 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.521309 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.521746 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.521600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.521746 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.521669 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.521962 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.521935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7503f84f-1a4e-4263-8f8a-dfd88c876194-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.523140 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.523121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.523261 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.523247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.532287 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.532265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7503f84f-1a4e-4263-8f8a-dfd88c876194-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.533308 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.533282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5cns\" (UniqueName: \"kubernetes.io/projected/7503f84f-1a4e-4263-8f8a-dfd88c876194-kube-api-access-v5cns\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm\" (UID: \"7503f84f-1a4e-4263-8f8a-dfd88c876194\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.670288 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.670259 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:48.819200 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:56:48.819160 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7503f84f_1a4e_4263_8f8a_dfd88c876194.slice/crio-29aae6c20667e2cc3469bbbddd789351c17a823d80693373f977cf07d480db29 WatchSource:0}: Error finding container 29aae6c20667e2cc3469bbbddd789351c17a823d80693373f977cf07d480db29: Status 404 returned error can't find the container with id 29aae6c20667e2cc3469bbbddd789351c17a823d80693373f977cf07d480db29 Apr 20 17:56:48.820714 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:48.820675 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm"] Apr 20 17:56:49.122648 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:49.122615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" event={"ID":"7503f84f-1a4e-4263-8f8a-dfd88c876194","Type":"ContainerStarted","Data":"29aae6c20667e2cc3469bbbddd789351c17a823d80693373f977cf07d480db29"} Apr 20 17:56:51.207616 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:51.207577 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 17:56:51.207882 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:51.207655 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 17:56:51.207882 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:51.207684 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 17:56:52.134063 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:52.134021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" event={"ID":"7503f84f-1a4e-4263-8f8a-dfd88c876194","Type":"ContainerStarted","Data":"e1d8c4d853b6d258997ffd61dab6de8aaffa7e66a86e523d09a306bbdbee621a"} Apr 20 17:56:52.157394 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:52.157346 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" podStartSLOduration=1.771341544 podStartE2EDuration="4.157332331s" podCreationTimestamp="2026-04-20 17:56:48 +0000 UTC" firstStartedPulling="2026-04-20 17:56:48.821349764 +0000 UTC m=+499.914192192" lastFinishedPulling="2026-04-20 17:56:51.207340551 +0000 UTC m=+502.300182979" observedRunningTime="2026-04-20 17:56:52.155073042 +0000 UTC m=+503.247915503" watchObservedRunningTime="2026-04-20 17:56:52.157332331 +0000 UTC m=+503.250174781" Apr 20 17:56:52.670658 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:52.670624 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:52.675349 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:52.675328 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:53.137572 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:53.137542 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:56:53.138617 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:56:53.138600 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm" Apr 20 17:57:19.336818 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.336732 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m2f92"] Apr 20 17:57:19.341155 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.341133 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" Apr 20 17:57:19.343785 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.343764 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 17:57:19.343996 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.343981 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 17:57:19.344060 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.344009 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-bbn79\"" Apr 20 17:57:19.351100 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.351078 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m2f92"] Apr 20 17:57:19.378084 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.378057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6cvb\" (UniqueName: \"kubernetes.io/projected/b4828b04-7947-44db-a1f8-77edb4e7b7e7-kube-api-access-p6cvb\") pod \"kuadrant-operator-catalog-m2f92\" (UID: \"b4828b04-7947-44db-a1f8-77edb4e7b7e7\") " pod="kuadrant-system/kuadrant-operator-catalog-m2f92" Apr 20 17:57:19.478952 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.478921 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6cvb\" (UniqueName: \"kubernetes.io/projected/b4828b04-7947-44db-a1f8-77edb4e7b7e7-kube-api-access-p6cvb\") pod \"kuadrant-operator-catalog-m2f92\" (UID: \"b4828b04-7947-44db-a1f8-77edb4e7b7e7\") " pod="kuadrant-system/kuadrant-operator-catalog-m2f92" Apr 20 17:57:19.486838 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.486811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6cvb\" (UniqueName: \"kubernetes.io/projected/b4828b04-7947-44db-a1f8-77edb4e7b7e7-kube-api-access-p6cvb\") pod \"kuadrant-operator-catalog-m2f92\" (UID: \"b4828b04-7947-44db-a1f8-77edb4e7b7e7\") " pod="kuadrant-system/kuadrant-operator-catalog-m2f92" Apr 20 17:57:19.651741 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.651716 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" Apr 20 17:57:19.697833 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.697802 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m2f92"] Apr 20 17:57:19.772227 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.772196 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m2f92"] Apr 20 17:57:19.774826 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:57:19.774798 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4828b04_7947_44db_a1f8_77edb4e7b7e7.slice/crio-db66f3fa972f93024877625c55a6d7b84e3f4aa13a913e3294003c75def89888 WatchSource:0}: Error finding container db66f3fa972f93024877625c55a6d7b84e3f4aa13a913e3294003c75def89888: Status 404 returned error can't find the container with id db66f3fa972f93024877625c55a6d7b84e3f4aa13a913e3294003c75def89888 Apr 20 17:57:19.908104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.908030 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4p8k5"] Apr 20 17:57:19.912325 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.912310 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:19.918523 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.918495 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4p8k5"] Apr 20 17:57:19.982759 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:19.982726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4dcr\" (UniqueName: \"kubernetes.io/projected/bd054fa6-d033-4a47-aeeb-71d4e3320397-kube-api-access-j4dcr\") pod \"kuadrant-operator-catalog-4p8k5\" (UID: \"bd054fa6-d033-4a47-aeeb-71d4e3320397\") " pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:20.083503 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:20.083472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4dcr\" (UniqueName: \"kubernetes.io/projected/bd054fa6-d033-4a47-aeeb-71d4e3320397-kube-api-access-j4dcr\") pod \"kuadrant-operator-catalog-4p8k5\" (UID: \"bd054fa6-d033-4a47-aeeb-71d4e3320397\") " pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:20.091566 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:20.091546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4dcr\" (UniqueName: \"kubernetes.io/projected/bd054fa6-d033-4a47-aeeb-71d4e3320397-kube-api-access-j4dcr\") pod \"kuadrant-operator-catalog-4p8k5\" (UID: \"bd054fa6-d033-4a47-aeeb-71d4e3320397\") " pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:20.222645 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:20.222569 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:20.222999 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:20.222957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" event={"ID":"b4828b04-7947-44db-a1f8-77edb4e7b7e7","Type":"ContainerStarted","Data":"db66f3fa972f93024877625c55a6d7b84e3f4aa13a913e3294003c75def89888"} Apr 20 17:57:20.344145 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:20.344119 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-4p8k5"] Apr 20 17:57:20.346946 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:57:20.346902 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd054fa6_d033_4a47_aeeb_71d4e3320397.slice/crio-29f2984319ac7ea5bab5a977e8707f4c741dac27666b1b2f742d94cacd4fe21e WatchSource:0}: Error finding container 29f2984319ac7ea5bab5a977e8707f4c741dac27666b1b2f742d94cacd4fe21e: Status 404 returned error can't find the container with id 29f2984319ac7ea5bab5a977e8707f4c741dac27666b1b2f742d94cacd4fe21e Apr 20 17:57:21.227020 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:21.226988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" event={"ID":"bd054fa6-d033-4a47-aeeb-71d4e3320397","Type":"ContainerStarted","Data":"29f2984319ac7ea5bab5a977e8707f4c741dac27666b1b2f742d94cacd4fe21e"} Apr 20 17:57:22.231641 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.231537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" event={"ID":"bd054fa6-d033-4a47-aeeb-71d4e3320397","Type":"ContainerStarted","Data":"bfacd576dea1b92e563e17d44fdfd521831d5897fe0a13e190d6f2a1f49d82ee"} Apr 20 17:57:22.232919 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.232891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" event={"ID":"b4828b04-7947-44db-a1f8-77edb4e7b7e7","Type":"ContainerStarted","Data":"277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8"} Apr 20 17:57:22.233057 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.233005 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" podUID="b4828b04-7947-44db-a1f8-77edb4e7b7e7" containerName="registry-server" containerID="cri-o://277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8" gracePeriod=2 Apr 20 17:57:22.248578 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.248541 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" podStartSLOduration=1.642721942 podStartE2EDuration="3.24852683s" podCreationTimestamp="2026-04-20 17:57:19 +0000 UTC" firstStartedPulling="2026-04-20 17:57:20.348372369 +0000 UTC m=+531.441214811" lastFinishedPulling="2026-04-20 17:57:21.954177271 +0000 UTC m=+533.047019699" observedRunningTime="2026-04-20 17:57:22.246747843 +0000 UTC m=+533.339590320" watchObservedRunningTime="2026-04-20 17:57:22.24852683 +0000 UTC m=+533.341369277" Apr 20 17:57:22.261860 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.261822 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" podStartSLOduration=1.08454595 podStartE2EDuration="3.261810132s" podCreationTimestamp="2026-04-20 17:57:19 +0000 
UTC" firstStartedPulling="2026-04-20 17:57:19.7760888 +0000 UTC m=+530.868931228" lastFinishedPulling="2026-04-20 17:57:21.953352968 +0000 UTC m=+533.046195410" observedRunningTime="2026-04-20 17:57:22.260349739 +0000 UTC m=+533.353192189" watchObservedRunningTime="2026-04-20 17:57:22.261810132 +0000 UTC m=+533.354652582" Apr 20 17:57:22.473315 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.473291 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" Apr 20 17:57:22.504195 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.504124 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6cvb\" (UniqueName: \"kubernetes.io/projected/b4828b04-7947-44db-a1f8-77edb4e7b7e7-kube-api-access-p6cvb\") pod \"b4828b04-7947-44db-a1f8-77edb4e7b7e7\" (UID: \"b4828b04-7947-44db-a1f8-77edb4e7b7e7\") " Apr 20 17:57:22.506138 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.506118 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4828b04-7947-44db-a1f8-77edb4e7b7e7-kube-api-access-p6cvb" (OuterVolumeSpecName: "kube-api-access-p6cvb") pod "b4828b04-7947-44db-a1f8-77edb4e7b7e7" (UID: "b4828b04-7947-44db-a1f8-77edb4e7b7e7"). InnerVolumeSpecName "kube-api-access-p6cvb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:57:22.605308 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:22.605275 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p6cvb\" (UniqueName: \"kubernetes.io/projected/b4828b04-7947-44db-a1f8-77edb4e7b7e7-kube-api-access-p6cvb\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:57:23.236938 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.236852 2575 generic.go:358] "Generic (PLEG): container finished" podID="b4828b04-7947-44db-a1f8-77edb4e7b7e7" containerID="277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8" exitCode=0 Apr 20 17:57:23.236938 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.236911 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" Apr 20 17:57:23.237398 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.236950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" event={"ID":"b4828b04-7947-44db-a1f8-77edb4e7b7e7","Type":"ContainerDied","Data":"277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8"} Apr 20 17:57:23.237398 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.236988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-m2f92" event={"ID":"b4828b04-7947-44db-a1f8-77edb4e7b7e7","Type":"ContainerDied","Data":"db66f3fa972f93024877625c55a6d7b84e3f4aa13a913e3294003c75def89888"} Apr 20 17:57:23.237398 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.237008 2575 scope.go:117] "RemoveContainer" containerID="277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8" Apr 20 17:57:23.245313 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.245299 2575 scope.go:117] "RemoveContainer" containerID="277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8" Apr 20 17:57:23.245562 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:57:23.245545 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8\": container with ID starting with 277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8 not found: ID does not exist" containerID="277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8" Apr 20 17:57:23.245624 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.245574 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8"} err="failed to get container status \"277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8\": rpc error: code = NotFound desc = could not find container \"277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8\": container with ID starting with 277d182420e431598e812ee10cd301f0c246ebf12ecfbcef96126f1a122940f8 not found: ID does not exist" Apr 20 17:57:23.256366 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.256343 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m2f92"] Apr 20 17:57:23.261758 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.261741 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-m2f92"] Apr 20 17:57:23.547856 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:23.547783 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4828b04-7947-44db-a1f8-77edb4e7b7e7" path="/var/lib/kubelet/pods/b4828b04-7947-44db-a1f8-77edb4e7b7e7/volumes" Apr 20 17:57:30.222818 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:30.222779 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:30.222818 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:30.222826 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:30.244193 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:30.244163 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:30.281206 
ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:30.281182 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-4p8k5" Apr 20 17:57:51.015014 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.014975 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm"] Apr 20 17:57:51.015498 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.015481 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4828b04-7947-44db-a1f8-77edb4e7b7e7" containerName="registry-server" Apr 20 17:57:51.015560 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.015499 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4828b04-7947-44db-a1f8-77edb4e7b7e7" containerName="registry-server" Apr 20 17:57:51.015611 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.015581 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4828b04-7947-44db-a1f8-77edb4e7b7e7" containerName="registry-server" Apr 20 17:57:51.018577 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.018557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" Apr 20 17:57:51.021352 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.021327 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 17:57:51.021613 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.021596 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-zj9zp\"" Apr 20 17:57:51.029669 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.029649 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm"] Apr 20 17:57:51.141059 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.141026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z4q6\" (UniqueName: \"kubernetes.io/projected/10b2c614-c160-4005-8f18-16b08e8d7e1a-kube-api-access-2z4q6\") pod \"dns-operator-controller-manager-648d5c98bc-cz8hm\" (UID: \"10b2c614-c160-4005-8f18-16b08e8d7e1a\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" Apr 20 17:57:51.241990 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.241959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z4q6\" (UniqueName: \"kubernetes.io/projected/10b2c614-c160-4005-8f18-16b08e8d7e1a-kube-api-access-2z4q6\") pod \"dns-operator-controller-manager-648d5c98bc-cz8hm\" (UID: \"10b2c614-c160-4005-8f18-16b08e8d7e1a\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" Apr 20 17:57:51.254711 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.254659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z4q6\" (UniqueName: \"kubernetes.io/projected/10b2c614-c160-4005-8f18-16b08e8d7e1a-kube-api-access-2z4q6\") pod \"dns-operator-controller-manager-648d5c98bc-cz8hm\" (UID: \"10b2c614-c160-4005-8f18-16b08e8d7e1a\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" Apr 20 17:57:51.328670 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.328587 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" Apr 20 17:57:51.464272 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:51.464231 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm"] Apr 20 17:57:51.466616 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:57:51.466593 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10b2c614_c160_4005_8f18_16b08e8d7e1a.slice/crio-7a3fa686405bc76b87ca9cd52600e31a91b5c4aa831bef5b0c9dce8400392abd WatchSource:0}: Error finding container 7a3fa686405bc76b87ca9cd52600e31a91b5c4aa831bef5b0c9dce8400392abd: Status 404 returned error can't find the container with id 7a3fa686405bc76b87ca9cd52600e31a91b5c4aa831bef5b0c9dce8400392abd Apr 20 17:57:52.330978 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:52.330937 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" event={"ID":"10b2c614-c160-4005-8f18-16b08e8d7e1a","Type":"ContainerStarted","Data":"7a3fa686405bc76b87ca9cd52600e31a91b5c4aa831bef5b0c9dce8400392abd"} Apr 20 17:57:54.339628 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.339537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" event={"ID":"10b2c614-c160-4005-8f18-16b08e8d7e1a","Type":"ContainerStarted","Data":"267500e346ef93a1a6ceceb8916cd98109475a248f14bdf63c666755b684c99a"} Apr 20 17:57:54.340046 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.339644 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" Apr 20 17:57:54.370793 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.370740 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" podStartSLOduration=1.941757639 podStartE2EDuration="4.37072487s" podCreationTimestamp="2026-04-20 17:57:50 +0000 UTC" firstStartedPulling="2026-04-20 17:57:51.468546943 +0000 UTC m=+562.561389371" lastFinishedPulling="2026-04-20 17:57:53.897514165 +0000 UTC m=+564.990356602" observedRunningTime="2026-04-20 17:57:54.367879776 +0000 UTC m=+565.460722226" watchObservedRunningTime="2026-04-20 17:57:54.37072487 +0000 UTC m=+565.463567319" Apr 20 17:57:54.530059 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.530020 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx"] Apr 20 17:57:54.533326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.533304 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:57:54.536442 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.536425 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-qflm5\"" Apr 20 17:57:54.543640 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.543619 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx"] Apr 20 17:57:54.673330 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.673302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpz5v\" (UniqueName: \"kubernetes.io/projected/eda5e9e1-c079-4e94-a381-0578f5151092-kube-api-access-vpz5v\") pod \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" (UID: \"eda5e9e1-c079-4e94-a381-0578f5151092\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:57:54.773827 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.773785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpz5v\" (UniqueName: \"kubernetes.io/projected/eda5e9e1-c079-4e94-a381-0578f5151092-kube-api-access-vpz5v\") pod \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" (UID: \"eda5e9e1-c079-4e94-a381-0578f5151092\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:57:54.786213 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.786180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpz5v\" (UniqueName: \"kubernetes.io/projected/eda5e9e1-c079-4e94-a381-0578f5151092-kube-api-access-vpz5v\") pod \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" (UID: \"eda5e9e1-c079-4e94-a381-0578f5151092\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:57:54.843293 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.843265 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:57:54.963497 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:54.963411 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx"] Apr 20 17:57:54.966230 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:57:54.966204 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeda5e9e1_c079_4e94_a381_0578f5151092.slice/crio-079cb980ef36359b0a2804967a62910bd504e9ba084d5f9c75cf039d94ae6fb0 WatchSource:0}: Error finding container 079cb980ef36359b0a2804967a62910bd504e9ba084d5f9c75cf039d94ae6fb0: Status 404 returned error can't find the container with id 079cb980ef36359b0a2804967a62910bd504e9ba084d5f9c75cf039d94ae6fb0 Apr 20 17:57:55.344803 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:55.344711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" event={"ID":"eda5e9e1-c079-4e94-a381-0578f5151092","Type":"ContainerStarted","Data":"079cb980ef36359b0a2804967a62910bd504e9ba084d5f9c75cf039d94ae6fb0"} Apr 20 17:57:57.353917 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:57.353820 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" event={"ID":"eda5e9e1-c079-4e94-a381-0578f5151092","Type":"ContainerStarted","Data":"4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba"} Apr 20 17:57:57.354369 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:57.353977 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:57:57.382124 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:57:57.382077 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" podStartSLOduration=1.4154720219999999 podStartE2EDuration="3.38206352s" podCreationTimestamp="2026-04-20 17:57:54 +0000 UTC" firstStartedPulling="2026-04-20 17:57:54.968172967 +0000 UTC m=+566.061015396" lastFinishedPulling="2026-04-20 17:57:56.934764463 +0000 UTC m=+568.027606894" observedRunningTime="2026-04-20 17:57:57.379391642 +0000 UTC m=+568.472234102" watchObservedRunningTime="2026-04-20 17:57:57.38206352 +0000 UTC m=+568.474905969" Apr 20 17:58:05.347608 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.347578 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-cz8hm" Apr 20 17:58:05.670472 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.670445 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j"] Apr 20 17:58:05.674173 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.674154 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:05.677182 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.677162 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-7bk76\"" Apr 20 17:58:05.694182 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.694162 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j"] Apr 20 17:58:05.769005 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.768976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwf9m\" (UniqueName: \"kubernetes.io/projected/8b229062-8260-46d7-a66a-f662822bcf9f-kube-api-access-dwf9m\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" (UID: \"8b229062-8260-46d7-a66a-f662822bcf9f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:05.769124 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.769025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b229062-8260-46d7-a66a-f662822bcf9f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" (UID: \"8b229062-8260-46d7-a66a-f662822bcf9f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:05.869435 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.869401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b229062-8260-46d7-a66a-f662822bcf9f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" (UID: \"8b229062-8260-46d7-a66a-f662822bcf9f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:05.869579 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.869494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwf9m\" (UniqueName: \"kubernetes.io/projected/8b229062-8260-46d7-a66a-f662822bcf9f-kube-api-access-dwf9m\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" (UID: \"8b229062-8260-46d7-a66a-f662822bcf9f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:05.869816 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.869797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b229062-8260-46d7-a66a-f662822bcf9f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" (UID: \"8b229062-8260-46d7-a66a-f662822bcf9f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:05.899935 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.899909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwf9m\" (UniqueName: \"kubernetes.io/projected/8b229062-8260-46d7-a66a-f662822bcf9f-kube-api-access-dwf9m\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" (UID: \"8b229062-8260-46d7-a66a-f662822bcf9f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:05.983720 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:05.983642 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:06.109476 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:06.109450 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j"] Apr 20 17:58:06.111575 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:58:06.111552 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b229062_8260_46d7_a66a_f662822bcf9f.slice/crio-e56bdd47d43da818184d7cbebd663871edd353aa9328eb1c4dac43b6d261856c WatchSource:0}: Error finding container e56bdd47d43da818184d7cbebd663871edd353aa9328eb1c4dac43b6d261856c: Status 404 returned error can't find the container with id e56bdd47d43da818184d7cbebd663871edd353aa9328eb1c4dac43b6d261856c Apr 20 17:58:06.385104 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:06.385068 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" event={"ID":"8b229062-8260-46d7-a66a-f662822bcf9f","Type":"ContainerStarted","Data":"e56bdd47d43da818184d7cbebd663871edd353aa9328eb1c4dac43b6d261856c"} Apr 20 17:58:08.360532 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:08.359909 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:58:12.407398 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:12.407361 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" event={"ID":"8b229062-8260-46d7-a66a-f662822bcf9f","Type":"ContainerStarted","Data":"5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd"} Apr 20 17:58:12.407800 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:12.407533 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:12.433167 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:12.433107 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" podStartSLOduration=1.724085849 podStartE2EDuration="7.433089624s" podCreationTimestamp="2026-04-20 17:58:05 +0000 UTC" firstStartedPulling="2026-04-20 17:58:06.11404157 +0000 UTC m=+577.206883999" lastFinishedPulling="2026-04-20 17:58:11.823045345 +0000 UTC m=+582.915887774" observedRunningTime="2026-04-20 17:58:12.432378397 +0000 UTC m=+583.525220838" watchObservedRunningTime="2026-04-20 17:58:12.433089624 +0000 UTC m=+583.525932076" Apr 20 17:58:23.413422 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:23.413388 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:24.938839 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.938801 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j"] Apr 20 17:58:24.939252 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.939017 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" containerName="manager" 
containerID="cri-o://5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd" gracePeriod=2 Apr 20 17:58:24.941465 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.941420 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:24.942267 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.942243 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j"] Apr 20 17:58:24.965818 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.965699 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d"] Apr 20 17:58:24.966201 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.966181 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" containerName="manager" Apr 20 17:58:24.966280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.966203 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" containerName="manager" Apr 20 17:58:24.966280 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.966277 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" containerName="manager" Apr 20 17:58:24.969208 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.969191 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:24.972903 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.972882 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx"] Apr 20 17:58:24.973177 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.973134 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" containerName="manager" containerID="cri-o://4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba" gracePeriod=2 Apr 20 17:58:24.988359 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.988338 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx"] Apr 20 17:58:24.990350 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:24.990327 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d"] Apr 20 17:58:25.002829 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.002658 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q"] Apr 20 17:58:25.003791 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.003772 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" containerName="manager" Apr 20 17:58:25.003891 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.003807 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" containerName="manager" Apr 20 17:58:25.003952 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.003932 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" containerName="manager" Apr 20 17:58:25.005099 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.005067 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.007013 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.006996 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" Apr 20 17:58:25.007210 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.007189 2575 status_manager.go:895] "Failed to get status for pod" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" err="pods \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.009151 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.009131 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.015854 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.015815 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q"] Apr 20 17:58:25.025286 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.025199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmr8t\" (UniqueName: \"kubernetes.io/projected/30b52323-99bf-46e7-8075-c34c02651a26-kube-api-access-hmr8t\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6f89d\" (UID: \"30b52323-99bf-46e7-8075-c34c02651a26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:25.025406 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.025304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-975bg\" (UniqueName: \"kubernetes.io/projected/fa8d0810-db7b-4ba6-9877-cb965caab815-kube-api-access-975bg\") pod \"limitador-operator-controller-manager-85c4996f8c-c6n6q\" (UID: \"fa8d0810-db7b-4ba6-9877-cb965caab815\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" Apr 20 17:58:25.025406 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.025338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30b52323-99bf-46e7-8075-c34c02651a26-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6f89d\" (UID: \"30b52323-99bf-46e7-8075-c34c02651a26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:25.027349 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.027317 2575 status_manager.go:895] "Failed to get status for pod" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" err="pods \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.029431 ip-10-0-137-82 kubenswrapper[2575]: 
I0420 17:58:25.029407 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.126448 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.126388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-975bg\" (UniqueName: \"kubernetes.io/projected/fa8d0810-db7b-4ba6-9877-cb965caab815-kube-api-access-975bg\") pod \"limitador-operator-controller-manager-85c4996f8c-c6n6q\" (UID: \"fa8d0810-db7b-4ba6-9877-cb965caab815\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" Apr 20 17:58:25.126609 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.126462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30b52323-99bf-46e7-8075-c34c02651a26-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6f89d\" (UID: \"30b52323-99bf-46e7-8075-c34c02651a26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:25.126609 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.126560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmr8t\" (UniqueName: \"kubernetes.io/projected/30b52323-99bf-46e7-8075-c34c02651a26-kube-api-access-hmr8t\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6f89d\" (UID: \"30b52323-99bf-46e7-8075-c34c02651a26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:25.126993 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.126969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30b52323-99bf-46e7-8075-c34c02651a26-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6f89d\" (UID: \"30b52323-99bf-46e7-8075-c34c02651a26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:25.136389 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.136339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-975bg\" (UniqueName: \"kubernetes.io/projected/fa8d0810-db7b-4ba6-9877-cb965caab815-kube-api-access-975bg\") pod \"limitador-operator-controller-manager-85c4996f8c-c6n6q\" (UID: \"fa8d0810-db7b-4ba6-9877-cb965caab815\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" Apr 20 17:58:25.136849 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.136819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmr8t\" (UniqueName: \"kubernetes.io/projected/30b52323-99bf-46e7-8075-c34c02651a26-kube-api-access-hmr8t\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-6f89d\" (UID: \"30b52323-99bf-46e7-8075-c34c02651a26\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:25.195036 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.194986 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:25.197296 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.197264 2575 status_manager.go:895] "Failed to get status for pod" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" err="pods \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.197864 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.197847 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:58:25.199150 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.199132 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.200919 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.200902 2575 status_manager.go:895] "Failed to get status for pod" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" err="pods \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.202702 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.202667 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.227143 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.227120 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b229062-8260-46d7-a66a-f662822bcf9f-extensions-socket-volume\") pod \"8b229062-8260-46d7-a66a-f662822bcf9f\" (UID: \"8b229062-8260-46d7-a66a-f662822bcf9f\") " Apr 20 17:58:25.227219 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.227182 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpz5v\" (UniqueName: \"kubernetes.io/projected/eda5e9e1-c079-4e94-a381-0578f5151092-kube-api-access-vpz5v\") pod \"eda5e9e1-c079-4e94-a381-0578f5151092\" (UID: \"eda5e9e1-c079-4e94-a381-0578f5151092\") " Apr 20 17:58:25.227257 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.227225 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwf9m\" 
(UniqueName: \"kubernetes.io/projected/8b229062-8260-46d7-a66a-f662822bcf9f-kube-api-access-dwf9m\") pod \"8b229062-8260-46d7-a66a-f662822bcf9f\" (UID: \"8b229062-8260-46d7-a66a-f662822bcf9f\") " Apr 20 17:58:25.227635 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.227612 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b229062-8260-46d7-a66a-f662822bcf9f-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "8b229062-8260-46d7-a66a-f662822bcf9f" (UID: "8b229062-8260-46d7-a66a-f662822bcf9f"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:58:25.229169 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.229147 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b229062-8260-46d7-a66a-f662822bcf9f-kube-api-access-dwf9m" (OuterVolumeSpecName: "kube-api-access-dwf9m") pod "8b229062-8260-46d7-a66a-f662822bcf9f" (UID: "8b229062-8260-46d7-a66a-f662822bcf9f"). InnerVolumeSpecName "kube-api-access-dwf9m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:58:25.229169 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.229156 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda5e9e1-c079-4e94-a381-0578f5151092-kube-api-access-vpz5v" (OuterVolumeSpecName: "kube-api-access-vpz5v") pod "eda5e9e1-c079-4e94-a381-0578f5151092" (UID: "eda5e9e1-c079-4e94-a381-0578f5151092"). InnerVolumeSpecName "kube-api-access-vpz5v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:58:25.328451 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.328418 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dwf9m\" (UniqueName: \"kubernetes.io/projected/8b229062-8260-46d7-a66a-f662822bcf9f-kube-api-access-dwf9m\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:58:25.328451 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.328446 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8b229062-8260-46d7-a66a-f662822bcf9f-extensions-socket-volume\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:58:25.328451 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.328457 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpz5v\" (UniqueName: \"kubernetes.io/projected/eda5e9e1-c079-4e94-a381-0578f5151092-kube-api-access-vpz5v\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:58:25.368184 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.368155 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:25.373903 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.373886 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" Apr 20 17:58:25.454597 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.454469 2575 generic.go:358] "Generic (PLEG): container finished" podID="8b229062-8260-46d7-a66a-f662822bcf9f" containerID="5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd" exitCode=0 Apr 20 17:58:25.454597 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.454517 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" Apr 20 17:58:25.454597 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.454560 2575 scope.go:117] "RemoveContainer" containerID="5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd" Apr 20 17:58:25.456089 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.456013 2575 generic.go:358] "Generic (PLEG): container finished" podID="eda5e9e1-c079-4e94-a381-0578f5151092" containerID="4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba" exitCode=0 Apr 20 17:58:25.456089 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.456067 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" Apr 20 17:58:25.457054 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.457015 2575 status_manager.go:895] "Failed to get status for pod" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" err="pods \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.460261 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.459992 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.462561 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.462501 2575 status_manager.go:895] "Failed to get status for pod" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" err="pods \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.464950 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.464876 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.469358 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.469316 2575 status_manager.go:895] "Failed to get status for pod" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" err="pods \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship 
found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.472026 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.471788 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.472798 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.472205 2575 scope.go:117] "RemoveContainer" containerID="5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd" Apr 20 17:58:25.472892 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:58:25.472811 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd\": container with ID starting with 5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd not found: ID does not exist" containerID="5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd" Apr 20 17:58:25.472892 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.472844 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd"} err="failed to get container status \"5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd\": rpc error: code = NotFound desc = could not find container \"5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd\": container with ID starting with 5b34b2ea6fa9c895408e9d73940a801ae79880da3baf1c8d12a7f5ab1fa1cbbd not found: ID does not exist" Apr 20 17:58:25.472892 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.472867 2575 scope.go:117] "RemoveContainer" containerID="4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba" Apr 20 17:58:25.477370 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.477343 2575 status_manager.go:895] "Failed to get status for pod" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4b7lx" err="pods \"limitador-operator-controller-manager-85c4996f8c-4b7lx\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.479372 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.479352 2575 status_manager.go:895] "Failed to get status for pod" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-htk9j" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-htk9j\" is forbidden: User \"system:node:ip-10-0-137-82.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-82.ec2.internal' and this object" Apr 20 17:58:25.482861 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.482820 2575 scope.go:117] "RemoveContainer" containerID="4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba" Apr 20 17:58:25.483223 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:58:25.483191 2575 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba\": container with ID starting with 4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba not found: ID does not exist" containerID="4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba" Apr 20 17:58:25.483279 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.483243 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba"} err="failed to get container status \"4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba\": rpc error: code = NotFound desc = could not find container \"4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba\": container with ID starting with 4663ff8b5592793351d77447988f74630c919f2dc29eccf548fa2bfa00ce6eba not found: ID does not exist" Apr 20 17:58:25.521075 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.521045 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d"] Apr 20 17:58:25.522585 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:58:25.522560 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30b52323_99bf_46e7_8075_c34c02651a26.slice/crio-09c20c4a6d466d02eb30f5e3f379421f2d7418e1e3ae3f361e6c850776b3c936 WatchSource:0}: Error finding container 09c20c4a6d466d02eb30f5e3f379421f2d7418e1e3ae3f361e6c850776b3c936: Status 404 returned error can't find the container with id 09c20c4a6d466d02eb30f5e3f379421f2d7418e1e3ae3f361e6c850776b3c936 Apr 20 17:58:25.549646 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.549622 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b229062-8260-46d7-a66a-f662822bcf9f" path="/var/lib/kubelet/pods/8b229062-8260-46d7-a66a-f662822bcf9f/volumes" Apr 20 17:58:25.550101 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.550080 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda5e9e1-c079-4e94-a381-0578f5151092" path="/var/lib/kubelet/pods/eda5e9e1-c079-4e94-a381-0578f5151092/volumes" Apr 20 17:58:25.551948 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:25.551928 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q"] Apr 20 17:58:25.554709 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:58:25.554667 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa8d0810_db7b_4ba6_9877_cb965caab815.slice/crio-9bfdbfa7e0ba324daf3561e7edfd0731d762bcf8b5c5f77fad8939555724c899 WatchSource:0}: Error finding container 9bfdbfa7e0ba324daf3561e7edfd0731d762bcf8b5c5f77fad8939555724c899: Status 404 returned error can't find the container with id 9bfdbfa7e0ba324daf3561e7edfd0731d762bcf8b5c5f77fad8939555724c899 Apr 20 17:58:26.462125 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:26.462092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" event={"ID":"fa8d0810-db7b-4ba6-9877-cb965caab815","Type":"ContainerStarted","Data":"3e0c2de95bc1dbfde0c770198b8c5b9a5d479d4b8533b0dd970fd4eb883bc36d"} Apr 20 17:58:26.462125 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:26.462129 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" event={"ID":"fa8d0810-db7b-4ba6-9877-cb965caab815","Type":"ContainerStarted","Data":"9bfdbfa7e0ba324daf3561e7edfd0731d762bcf8b5c5f77fad8939555724c899"} Apr 20 17:58:26.462599 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:26.462143 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" Apr 20 17:58:26.464175 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:26.464142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" event={"ID":"30b52323-99bf-46e7-8075-c34c02651a26","Type":"ContainerStarted","Data":"b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8"} Apr 20 17:58:26.464266 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:26.464178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" event={"ID":"30b52323-99bf-46e7-8075-c34c02651a26","Type":"ContainerStarted","Data":"09c20c4a6d466d02eb30f5e3f379421f2d7418e1e3ae3f361e6c850776b3c936"} Apr 20 17:58:26.464312 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:26.464281 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:26.481085 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:26.481036 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" podStartSLOduration=2.481020853 podStartE2EDuration="2.481020853s" podCreationTimestamp="2026-04-20 17:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:58:26.480572945 +0000 UTC m=+597.573415394" watchObservedRunningTime="2026-04-20 17:58:26.481020853 +0000 UTC m=+597.573863305" Apr 20 17:58:26.510107 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:26.510066 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" podStartSLOduration=2.510052178 podStartE2EDuration="2.510052178s" podCreationTimestamp="2026-04-20 17:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:58:26.508464598 +0000 UTC m=+597.601307048" watchObservedRunningTime="2026-04-20 17:58:26.510052178 +0000 UTC m=+597.602894627" Apr 20 17:58:29.437908 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:29.437866 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 17:58:29.438993 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:29.438967 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 17:58:29.440332 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:29.440313 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 17:58:29.441253 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:29.441235 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 17:58:37.470348 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:37.470317 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-c6n6q" Apr 20 17:58:37.470823 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:37.470376 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:40.615998 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.615968 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d"] Apr 20 17:58:40.616377 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.616190 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" podUID="30b52323-99bf-46e7-8075-c34c02651a26" containerName="manager" containerID="cri-o://b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8" gracePeriod=10 Apr 20 17:58:40.816978 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.816952 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh"] Apr 20 17:58:40.820422 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.820404 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:40.832451 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.832429 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh"] Apr 20 17:58:40.856525 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.856503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/804bd7b9-2c27-4c3f-b4dc-eca891b3722f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wc5dh\" (UID: \"804bd7b9-2c27-4c3f-b4dc-eca891b3722f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:40.856615 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.856536 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sttvm\" (UniqueName: \"kubernetes.io/projected/804bd7b9-2c27-4c3f-b4dc-eca891b3722f-kube-api-access-sttvm\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wc5dh\" (UID: \"804bd7b9-2c27-4c3f-b4dc-eca891b3722f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:40.867310 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.867267 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:40.957229 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.957195 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmr8t\" (UniqueName: \"kubernetes.io/projected/30b52323-99bf-46e7-8075-c34c02651a26-kube-api-access-hmr8t\") pod \"30b52323-99bf-46e7-8075-c34c02651a26\" (UID: \"30b52323-99bf-46e7-8075-c34c02651a26\") " Apr 20 17:58:40.957229 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.957233 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30b52323-99bf-46e7-8075-c34c02651a26-extensions-socket-volume\") pod \"30b52323-99bf-46e7-8075-c34c02651a26\" (UID: \"30b52323-99bf-46e7-8075-c34c02651a26\") " Apr 20 17:58:40.957442 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.957429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/804bd7b9-2c27-4c3f-b4dc-eca891b3722f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wc5dh\" (UID: \"804bd7b9-2c27-4c3f-b4dc-eca891b3722f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:40.957492 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.957455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sttvm\" (UniqueName: \"kubernetes.io/projected/804bd7b9-2c27-4c3f-b4dc-eca891b3722f-kube-api-access-sttvm\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wc5dh\" (UID: \"804bd7b9-2c27-4c3f-b4dc-eca891b3722f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:40.957646 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.957621 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b52323-99bf-46e7-8075-c34c02651a26-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "30b52323-99bf-46e7-8075-c34c02651a26" (UID: "30b52323-99bf-46e7-8075-c34c02651a26"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 17:58:40.957878 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.957862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/804bd7b9-2c27-4c3f-b4dc-eca891b3722f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wc5dh\" (UID: \"804bd7b9-2c27-4c3f-b4dc-eca891b3722f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:40.959326 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.959310 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b52323-99bf-46e7-8075-c34c02651a26-kube-api-access-hmr8t" (OuterVolumeSpecName: "kube-api-access-hmr8t") pod "30b52323-99bf-46e7-8075-c34c02651a26" (UID: "30b52323-99bf-46e7-8075-c34c02651a26"). InnerVolumeSpecName "kube-api-access-hmr8t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 17:58:40.969997 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:40.969973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sttvm\" (UniqueName: \"kubernetes.io/projected/804bd7b9-2c27-4c3f-b4dc-eca891b3722f-kube-api-access-sttvm\") pod \"kuadrant-operator-controller-manager-55c7f4c975-wc5dh\" (UID: \"804bd7b9-2c27-4c3f-b4dc-eca891b3722f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:41.057880 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.057847 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hmr8t\" (UniqueName: \"kubernetes.io/projected/30b52323-99bf-46e7-8075-c34c02651a26-kube-api-access-hmr8t\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:58:41.057880 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.057875 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30b52323-99bf-46e7-8075-c34c02651a26-extensions-socket-volume\") on node \"ip-10-0-137-82.ec2.internal\" DevicePath \"\"" Apr 20 17:58:41.132401 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.132377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:41.253535 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.253493 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh"] Apr 20 17:58:41.256166 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:58:41.256134 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804bd7b9_2c27_4c3f_b4dc_eca891b3722f.slice/crio-70aea7543b3e1a8a86505da1b1bf7cdc7a8dc66eeb118863b51cf7d7f7739382 WatchSource:0}: Error finding container 70aea7543b3e1a8a86505da1b1bf7cdc7a8dc66eeb118863b51cf7d7f7739382: Status 404 returned error can't find the container with id 70aea7543b3e1a8a86505da1b1bf7cdc7a8dc66eeb118863b51cf7d7f7739382 Apr 20 17:58:41.516180 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.516083 2575 generic.go:358] "Generic (PLEG): container finished" podID="30b52323-99bf-46e7-8075-c34c02651a26" containerID="b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8" exitCode=0 Apr 20 17:58:41.516180 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.516140 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" Apr 20 17:58:41.516396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.516172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" event={"ID":"30b52323-99bf-46e7-8075-c34c02651a26","Type":"ContainerDied","Data":"b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8"} Apr 20 17:58:41.516396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.516207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d" event={"ID":"30b52323-99bf-46e7-8075-c34c02651a26","Type":"ContainerDied","Data":"09c20c4a6d466d02eb30f5e3f379421f2d7418e1e3ae3f361e6c850776b3c936"} Apr 20 17:58:41.516396 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.516223 2575 scope.go:117] "RemoveContainer" containerID="b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8" Apr 20 17:58:41.517674 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.517646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" event={"ID":"804bd7b9-2c27-4c3f-b4dc-eca891b3722f","Type":"ContainerStarted","Data":"a612c44a1843a4e4e48a8379f9bd71f1e6ba2cdea56361e943e31b5abc8443cf"} Apr 20 17:58:41.517814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.517684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" event={"ID":"804bd7b9-2c27-4c3f-b4dc-eca891b3722f","Type":"ContainerStarted","Data":"70aea7543b3e1a8a86505da1b1bf7cdc7a8dc66eeb118863b51cf7d7f7739382"} Apr 20 17:58:41.517814 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.517811 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:58:41.524751 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.524734 2575 scope.go:117] "RemoveContainer" containerID="b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8" Apr 20 17:58:41.524990 ip-10-0-137-82 kubenswrapper[2575]: E0420 17:58:41.524973 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8\": container with ID starting with b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8 not found: ID does not exist" containerID="b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8" Apr 20 17:58:41.525058 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.525000 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8"} err="failed to get container status \"b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8\": rpc error: code = NotFound desc = could not find container \"b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8\": container with ID starting with b24ba57bce7bb0b4b9bb0f48e89da4c5ccbe2595c9bafda43cd04bf7b509b1f8 not found: ID does not exist" Apr 20 17:58:41.551176 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.551129 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" podStartSLOduration=1.5511186110000001 
podStartE2EDuration="1.551118611s" podCreationTimestamp="2026-04-20 17:58:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 17:58:41.548637194 +0000 UTC m=+612.641479643" watchObservedRunningTime="2026-04-20 17:58:41.551118611 +0000 UTC m=+612.643961061" Apr 20 17:58:41.568070 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.568046 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d"] Apr 20 17:58:41.575112 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:41.575094 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-6f89d"] Apr 20 17:58:43.548131 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:43.548098 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b52323-99bf-46e7-8075-c34c02651a26" path="/var/lib/kubelet/pods/30b52323-99bf-46e7-8075-c34c02651a26/volumes" Apr 20 17:58:52.524112 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:58:52.524035 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-wc5dh" Apr 20 17:59:10.970521 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:10.970489 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 17:59:10.971155 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:10.971076 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b52323-99bf-46e7-8075-c34c02651a26" containerName="manager" Apr 20 17:59:10.971155 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:10.971097 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b52323-99bf-46e7-8075-c34c02651a26" containerName="manager" Apr 20 17:59:10.971290 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:10.971228 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="30b52323-99bf-46e7-8075-c34c02651a26" containerName="manager" Apr 20 17:59:10.974511 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:10.974493 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:10.976914 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:10.976894 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-tbdkf\"" Apr 20 17:59:10.977027 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:10.976980 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 17:59:10.982926 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:10.982906 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 17:59:11.010272 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.010245 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 17:59:11.098853 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.098821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1f22e3a9-bdb4-479d-9236-f1816717551c-config-file\") pod \"limitador-limitador-78c99df468-qpz9p\" (UID: \"1f22e3a9-bdb4-479d-9236-f1816717551c\") " pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:11.099039 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.098895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm8r6\" (UniqueName: \"kubernetes.io/projected/1f22e3a9-bdb4-479d-9236-f1816717551c-kube-api-access-qm8r6\") pod \"limitador-limitador-78c99df468-qpz9p\" (UID: \"1f22e3a9-bdb4-479d-9236-f1816717551c\") " pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:11.199600 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.199567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qm8r6\" (UniqueName: \"kubernetes.io/projected/1f22e3a9-bdb4-479d-9236-f1816717551c-kube-api-access-qm8r6\") pod \"limitador-limitador-78c99df468-qpz9p\" (UID: \"1f22e3a9-bdb4-479d-9236-f1816717551c\") " pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:11.199780 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.199646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1f22e3a9-bdb4-479d-9236-f1816717551c-config-file\") pod \"limitador-limitador-78c99df468-qpz9p\" (UID: \"1f22e3a9-bdb4-479d-9236-f1816717551c\") " pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:11.200221 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.200203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1f22e3a9-bdb4-479d-9236-f1816717551c-config-file\") pod \"limitador-limitador-78c99df468-qpz9p\" (UID: \"1f22e3a9-bdb4-479d-9236-f1816717551c\") " pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:11.207095 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.207077 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm8r6\" (UniqueName: \"kubernetes.io/projected/1f22e3a9-bdb4-479d-9236-f1816717551c-kube-api-access-qm8r6\") pod \"limitador-limitador-78c99df468-qpz9p\" (UID: \"1f22e3a9-bdb4-479d-9236-f1816717551c\") " pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:11.286643 
ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.286556 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:11.420244 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.420212 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 17:59:11.422764 ip-10-0-137-82 kubenswrapper[2575]: W0420 17:59:11.422734 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f22e3a9_bdb4_479d_9236_f1816717551c.slice/crio-4de2c1ecd00bcfc27ce74ec19012f9e52182083b336f357cdc88630b724d0691 WatchSource:0}: Error finding container 4de2c1ecd00bcfc27ce74ec19012f9e52182083b336f357cdc88630b724d0691: Status 404 returned error can't find the container with id 4de2c1ecd00bcfc27ce74ec19012f9e52182083b336f357cdc88630b724d0691 Apr 20 17:59:11.618445 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:11.618415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" event={"ID":"1f22e3a9-bdb4-479d-9236-f1816717551c","Type":"ContainerStarted","Data":"4de2c1ecd00bcfc27ce74ec19012f9e52182083b336f357cdc88630b724d0691"} Apr 20 17:59:14.629308 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:14.629275 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" event={"ID":"1f22e3a9-bdb4-479d-9236-f1816717551c","Type":"ContainerStarted","Data":"f3dfe7d85a8a924643de005368fc9884542a6a11e9b396e831cdbb7e87f60437"} Apr 20 17:59:14.629708 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:14.629393 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 17:59:14.647312 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:14.647262 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" podStartSLOduration=2.121808511 podStartE2EDuration="4.647251724s" podCreationTimestamp="2026-04-20 17:59:10 +0000 UTC" firstStartedPulling="2026-04-20 17:59:11.424550641 +0000 UTC m=+642.517393069" lastFinishedPulling="2026-04-20 17:59:13.949993839 +0000 UTC m=+645.042836282" observedRunningTime="2026-04-20 17:59:14.64635623 +0000 UTC m=+645.739198682" watchObservedRunningTime="2026-04-20 17:59:14.647251724 +0000 UTC m=+645.740094174" Apr 20 17:59:25.633006 ip-10-0-137-82 kubenswrapper[2575]: I0420 17:59:25.632981 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-qpz9p" Apr 20 18:00:05.691295 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:00:05.691254 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 18:00:33.616438 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:00:33.616404 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 18:00:38.993992 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:00:38.993958 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 18:01:04.999124 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:01:04.999088 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 18:01:08.685168 ip-10-0-137-82 
kubenswrapper[2575]: I0420 18:01:08.685132 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 18:01:13.493516 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:01:13.493482 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 18:01:34.085844 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:01:34.085802 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-qpz9p"] Apr 20 18:03:29.461702 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:29.461658 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 18:03:29.462186 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:29.461773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 18:03:29.464043 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:29.464025 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 18:03:29.464177 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:29.464161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 18:03:39.974414 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:39.974388 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b8c4c7886-t7jxp_cc572493-2a3e-4b9f-a55e-486e62f87313/manager/0.log" Apr 20 18:03:41.509158 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:41.509121 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-cz8hm_10b2c614-c160-4005-8f18-16b08e8d7e1a/manager/0.log" Apr 20 18:03:41.710383 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:41.710345 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-4p8k5_bd054fa6-d033-4a47-aeeb-71d4e3320397/registry-server/0.log" Apr 20 18:03:41.817418 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:41.817338 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-wc5dh_804bd7b9-2c27-4c3f-b4dc-eca891b3722f/manager/0.log" Apr 20 18:03:41.923489 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:41.923458 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-qpz9p_1f22e3a9-bdb4-479d-9236-f1816717551c/limitador/0.log" Apr 20 18:03:42.029447 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:42.029412 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-c6n6q_fa8d0810-db7b-4ba6-9877-cb965caab815/manager/0.log" Apr 20 18:03:42.344777 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:42.344748 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm_7503f84f-1a4e-4263-8f8a-dfd88c876194/istio-proxy/0.log" Apr 20 18:03:42.868872 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:42.868843 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-7bbd7cc87d-kgrp2_ca90dcc0-b07c-42a3-92a5-557d7ae40ea7/router/0.log" Apr 20 18:03:47.464739 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.464682 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dqzrr/must-gather-lgxbx"] Apr 20 18:03:47.468345 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.468321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dqzrr/must-gather-lgxbx" Apr 20 18:03:47.470884 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.470856 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dqzrr\"/\"openshift-service-ca.crt\"" Apr 20 18:03:47.472199 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.472176 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dqzrr\"/\"default-dockercfg-mm66t\"" Apr 20 18:03:47.472199 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.472179 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dqzrr\"/\"kube-root-ca.crt\"" Apr 20 18:03:47.483150 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.483127 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dqzrr/must-gather-lgxbx"] Apr 20 18:03:47.575526 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.575497 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crcth\" (UniqueName: \"kubernetes.io/projected/b00bdcec-bd1a-4926-9e38-b441ca354078-kube-api-access-crcth\") pod \"must-gather-lgxbx\" (UID: \"b00bdcec-bd1a-4926-9e38-b441ca354078\") " pod="openshift-must-gather-dqzrr/must-gather-lgxbx" Apr 20 18:03:47.575713 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.575592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b00bdcec-bd1a-4926-9e38-b441ca354078-must-gather-output\") pod \"must-gather-lgxbx\" (UID: \"b00bdcec-bd1a-4926-9e38-b441ca354078\") " pod="openshift-must-gather-dqzrr/must-gather-lgxbx" Apr 20 18:03:47.676401 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.676375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crcth\" (UniqueName: \"kubernetes.io/projected/b00bdcec-bd1a-4926-9e38-b441ca354078-kube-api-access-crcth\") pod \"must-gather-lgxbx\" (UID: \"b00bdcec-bd1a-4926-9e38-b441ca354078\") " pod="openshift-must-gather-dqzrr/must-gather-lgxbx" Apr 20 18:03:47.676564 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.676435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b00bdcec-bd1a-4926-9e38-b441ca354078-must-gather-output\") pod \"must-gather-lgxbx\" (UID: \"b00bdcec-bd1a-4926-9e38-b441ca354078\") " pod="openshift-must-gather-dqzrr/must-gather-lgxbx" Apr 20 18:03:47.676802 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.676783 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b00bdcec-bd1a-4926-9e38-b441ca354078-must-gather-output\") pod \"must-gather-lgxbx\" (UID: \"b00bdcec-bd1a-4926-9e38-b441ca354078\") " pod="openshift-must-gather-dqzrr/must-gather-lgxbx" Apr 20 18:03:47.689122 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.689093 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crcth\" (UniqueName: \"kubernetes.io/projected/b00bdcec-bd1a-4926-9e38-b441ca354078-kube-api-access-crcth\") pod \"must-gather-lgxbx\" (UID: \"b00bdcec-bd1a-4926-9e38-b441ca354078\") " pod="openshift-must-gather-dqzrr/must-gather-lgxbx" Apr 20 18:03:47.790800 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.790726 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dqzrr/must-gather-lgxbx" Apr 20 18:03:47.911136 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.911109 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dqzrr/must-gather-lgxbx"] Apr 20 18:03:47.913518 ip-10-0-137-82 kubenswrapper[2575]: W0420 18:03:47.913487 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00bdcec_bd1a_4926_9e38_b441ca354078.slice/crio-469f093003e189578bb96598550b5ec349dd71f546105a65f49a57fa7a38cf38 WatchSource:0}: Error finding container 469f093003e189578bb96598550b5ec349dd71f546105a65f49a57fa7a38cf38: Status 404 returned error can't find the container with id 469f093003e189578bb96598550b5ec349dd71f546105a65f49a57fa7a38cf38 Apr 20 18:03:47.915360 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:47.915341 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 18:03:48.563440 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:48.563406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/must-gather-lgxbx" event={"ID":"b00bdcec-bd1a-4926-9e38-b441ca354078","Type":"ContainerStarted","Data":"469f093003e189578bb96598550b5ec349dd71f546105a65f49a57fa7a38cf38"} Apr 20 18:03:49.569097 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:49.569053 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/must-gather-lgxbx" event={"ID":"b00bdcec-bd1a-4926-9e38-b441ca354078","Type":"ContainerStarted","Data":"69481299535f4f6c16be945237ec36ded299120b0a467a7d01a23b234bf65792"} Apr 20 18:03:49.569097 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:49.569101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/must-gather-lgxbx" event={"ID":"b00bdcec-bd1a-4926-9e38-b441ca354078","Type":"ContainerStarted","Data":"b75d2565c17f448d03937b60bd7a006b36e87acbbdbf883cfbd88781f0fdad12"} Apr 20 18:03:49.586851 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:49.586807 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dqzrr/must-gather-lgxbx" podStartSLOduration=1.7455981600000001 podStartE2EDuration="2.586791263s" podCreationTimestamp="2026-04-20 18:03:47 +0000 UTC" firstStartedPulling="2026-04-20 18:03:47.915528826 +0000 UTC m=+919.008371255" lastFinishedPulling="2026-04-20 18:03:48.756721926 +0000 UTC m=+919.849564358" observedRunningTime="2026-04-20 18:03:49.58565442 +0000 UTC m=+920.678496870" watchObservedRunningTime="2026-04-20 18:03:49.586791263 +0000 UTC m=+920.679633712" Apr 20 18:03:50.450863 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:50.450836 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pn6v5_95da1003-73aa-4420-b974-8a7d21bee404/global-pull-secret-syncer/0.log" Apr 20 18:03:50.537946 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:50.537913 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-2mz86_d9d4b966-027c-4c5f-8280-4626703acbf5/konnectivity-agent/0.log" Apr 20 18:03:50.635623 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:50.635579 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-82.ec2.internal_3e49c9e8f0fc73a695dc3fdeacc6ba89/haproxy/0.log" Apr 20 18:03:54.906471 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:54.906373 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-cz8hm_10b2c614-c160-4005-8f18-16b08e8d7e1a/manager/0.log" Apr 20 18:03:54.964299 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:54.964259 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-4p8k5_bd054fa6-d033-4a47-aeeb-71d4e3320397/registry-server/0.log" Apr 20 18:03:55.013836 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:55.013801 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-wc5dh_804bd7b9-2c27-4c3f-b4dc-eca891b3722f/manager/0.log" Apr 20 18:03:55.035709 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:55.035232 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-qpz9p_1f22e3a9-bdb4-479d-9236-f1816717551c/limitador/0.log" Apr 20 18:03:55.069718 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:55.069657 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-c6n6q_fa8d0810-db7b-4ba6-9877-cb965caab815/manager/0.log" Apr 20 18:03:56.592650 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:56.592574 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kvg82_33f6c414-d3f6-4ff7-b22e-e998f3e790bf/kube-state-metrics/0.log" Apr 20 18:03:56.621928 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:56.621891 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kvg82_33f6c414-d3f6-4ff7-b22e-e998f3e790bf/kube-rbac-proxy-main/0.log" Apr 20 18:03:56.649153 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:56.649122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-kvg82_33f6c414-d3f6-4ff7-b22e-e998f3e790bf/kube-rbac-proxy-self/0.log" Apr 20 18:03:56.714888 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:56.714820 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-2brbm_22557b24-c8ee-4a18-949c-0102a598d5a9/monitoring-plugin/0.log" Apr 20 18:03:56.929258 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:56.929225 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmh2c_ad44e0ba-0906-4c2e-a55b-edba31d4f5df/node-exporter/0.log" Apr 20 18:03:56.956451 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:56.956400 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmh2c_ad44e0ba-0906-4c2e-a55b-edba31d4f5df/kube-rbac-proxy/0.log" Apr 20 18:03:56.993485 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:56.993462 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qmh2c_ad44e0ba-0906-4c2e-a55b-edba31d4f5df/init-textfile/0.log" Apr 20 18:03:57.104391 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.104332 
2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9/prometheus/0.log" Apr 20 18:03:57.125241 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.125213 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9/config-reloader/0.log" Apr 20 18:03:57.150370 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.150344 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9/thanos-sidecar/0.log" Apr 20 18:03:57.172393 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.172368 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9/kube-rbac-proxy-web/0.log" Apr 20 18:03:57.194156 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.194091 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9/kube-rbac-proxy/0.log" Apr 20 18:03:57.216575 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.216548 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9/kube-rbac-proxy-thanos/0.log" Apr 20 18:03:57.246450 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.246428 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3f2bc841-2a6c-4bd1-9ca7-445ea635dbe9/init-config-reloader/0.log" Apr 20 18:03:57.419680 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.419654 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fc96f9886-8bm76_5659e814-9376-4138-a459-a38bc362b01b/thanos-query/0.log" Apr 20 18:03:57.443715 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.443674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fc96f9886-8bm76_5659e814-9376-4138-a459-a38bc362b01b/kube-rbac-proxy-web/0.log" Apr 20 18:03:57.469649 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.469570 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fc96f9886-8bm76_5659e814-9376-4138-a459-a38bc362b01b/kube-rbac-proxy/0.log" Apr 20 18:03:57.492386 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.492359 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fc96f9886-8bm76_5659e814-9376-4138-a459-a38bc362b01b/prom-label-proxy/0.log" Apr 20 18:03:57.513920 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.513896 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fc96f9886-8bm76_5659e814-9376-4138-a459-a38bc362b01b/kube-rbac-proxy-rules/0.log" Apr 20 18:03:57.535272 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:57.535246 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fc96f9886-8bm76_5659e814-9376-4138-a459-a38bc362b01b/kube-rbac-proxy-metrics/0.log" Apr 20 18:03:59.182021 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.181989 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn"] Apr 20 18:03:59.189051 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.189019 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.195790 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.195761 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn"] Apr 20 18:03:59.198803 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.198778 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/2.log" Apr 20 18:03:59.204766 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.204743 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-x9jbt_7b844265-ed78-4d7b-ae2f-e0af244b29a2/console-operator/3.log" Apr 20 18:03:59.298159 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.298123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-sys\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.298402 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.298379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc85m\" (UniqueName: \"kubernetes.io/projected/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-kube-api-access-vc85m\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.298602 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.298586 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-proc\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.298727 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.298712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-lib-modules\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.298879 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.298848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-podres\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399531 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-proc\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399531 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399535 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-lib-modules\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399797 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-podres\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399797 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-sys\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399797 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vc85m\" (UniqueName: \"kubernetes.io/projected/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-kube-api-access-vc85m\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399797 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-proc\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399797 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-lib-modules\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399797 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399752 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-sys\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.399993 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.399803 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-podres\") pod \"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.408418 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.408397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc85m\" (UniqueName: \"kubernetes.io/projected/d72a095f-62ad-42a2-97eb-245e4b7ce2bf-kube-api-access-vc85m\") pod 
\"perf-node-gather-daemonset-9v6pn\" (UID: \"d72a095f-62ad-42a2-97eb-245e4b7ce2bf\") " pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.503747 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.503263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:03:59.657310 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:03:59.657272 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn"] Apr 20 18:03:59.663078 ip-10-0-137-82 kubenswrapper[2575]: W0420 18:03:59.663036 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd72a095f_62ad_42a2_97eb_245e4b7ce2bf.slice/crio-e4462f8a4c2afa7461d1cf621d0d651a5880d12b64120826f1f3f2f7d39b0007 WatchSource:0}: Error finding container e4462f8a4c2afa7461d1cf621d0d651a5880d12b64120826f1f3f2f7d39b0007: Status 404 returned error can't find the container with id e4462f8a4c2afa7461d1cf621d0d651a5880d12b64120826f1f3f2f7d39b0007 Apr 20 18:04:00.237408 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:00.237377 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-9pk2t_9d65389c-a51a-4254-9236-b91567c36c0d/volume-data-source-validator/0.log" Apr 20 18:04:00.622447 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:00.622405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" event={"ID":"d72a095f-62ad-42a2-97eb-245e4b7ce2bf","Type":"ContainerStarted","Data":"d325a51a82c57f8225d8ca1fc2965b343ee00960305b8cd2f6fd7bf126216726"} Apr 20 18:04:00.623400 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:00.623368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" event={"ID":"d72a095f-62ad-42a2-97eb-245e4b7ce2bf","Type":"ContainerStarted","Data":"e4462f8a4c2afa7461d1cf621d0d651a5880d12b64120826f1f3f2f7d39b0007"} Apr 20 18:04:00.623547 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:00.623404 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:04:00.640146 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:00.640096 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" podStartSLOduration=1.640079576 podStartE2EDuration="1.640079576s" podCreationTimestamp="2026-04-20 18:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 18:04:00.63789005 +0000 UTC m=+931.730732495" watchObservedRunningTime="2026-04-20 18:04:00.640079576 +0000 UTC m=+931.732922025" Apr 20 18:04:01.041673 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:01.041575 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6kmcw_4dc154c0-3907-4457-b37d-d6e899b39fff/dns/0.log" Apr 20 18:04:01.064974 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:01.064948 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6kmcw_4dc154c0-3907-4457-b37d-d6e899b39fff/kube-rbac-proxy/0.log" Apr 20 18:04:01.180502 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:01.180474 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-5gv96_70b177d7-2721-4fb0-85f6-0e4da108fdaf/dns-node-resolver/0.log" Apr 20 18:04:01.676917 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:01.676885 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-985f6489d-cxh24_3d171c0a-dae9-40e6-afaf-3a8d5eb4a6e4/registry/0.log" Apr 20 18:04:01.742938 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:01.742898 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xgfmk_799ea967-f6fc-4097-8bbf-38dcdeaa4107/node-ca/0.log" Apr 20 18:04:02.540827 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:02.540795 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfngcgm_7503f84f-1a4e-4263-8f8a-dfd88c876194/istio-proxy/0.log" Apr 20 18:04:02.689748 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:02.689678 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7bbd7cc87d-kgrp2_ca90dcc0-b07c-42a3-92a5-557d7ae40ea7/router/0.log" Apr 20 18:04:03.272570 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:03.272527 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kj29z_ff21a6d9-bc28-45e4-96ea-9f89c9bc1ce3/serve-healthcheck-canary/0.log" Apr 20 18:04:03.906770 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:03.906718 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4bjjc_b3c0232c-d2c7-4e1e-bf28-9713d04a4895/kube-rbac-proxy/0.log" Apr 20 18:04:03.926897 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:03.926855 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4bjjc_b3c0232c-d2c7-4e1e-bf28-9713d04a4895/exporter/0.log" Apr 20 18:04:03.947995 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:03.947970 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4bjjc_b3c0232c-d2c7-4e1e-bf28-9713d04a4895/extractor/0.log" Apr 20 18:04:06.024679 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:06.024649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-b8c4c7886-t7jxp_cc572493-2a3e-4b9f-a55e-486e62f87313/manager/0.log" Apr 20 18:04:06.639631 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:06.639605 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dqzrr/perf-node-gather-daemonset-9v6pn" Apr 20 18:04:07.181779 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:07.181748 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7589d7b74d-w54d2_92158631-16a6-4817-9a0c-0e7b403207e7/manager/0.log" Apr 20 18:04:11.653397 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:11.653367 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-444hg_7cf1c858-dc4d-4fc1-8868-b98320883062/migrator/0.log" Apr 20 18:04:11.678344 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:11.678318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-444hg_7cf1c858-dc4d-4fc1-8868-b98320883062/graceful-termination/0.log" Apr 20 18:04:12.037947 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:12.037863 2575 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wtp2c_7ac7cee7-8495-490d-92f9-c31987536747/kube-storage-version-migrator-operator/1.log" Apr 20 18:04:12.039072 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:12.039044 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wtp2c_7ac7cee7-8495-490d-92f9-c31987536747/kube-storage-version-migrator-operator/0.log" Apr 20 18:04:13.067735 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.067679 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4cnfn_573305bc-ab90-4807-aab8-65f52ffaf213/kube-multus/0.log" Apr 20 18:04:13.126777 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.126753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9jtw5_f4675b7e-7738-4100-be48-af94288931f3/kube-multus-additional-cni-plugins/0.log" Apr 20 18:04:13.150391 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.150367 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9jtw5_f4675b7e-7738-4100-be48-af94288931f3/egress-router-binary-copy/0.log" Apr 20 18:04:13.172031 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.172008 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9jtw5_f4675b7e-7738-4100-be48-af94288931f3/cni-plugins/0.log" Apr 20 18:04:13.200908 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.200886 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9jtw5_f4675b7e-7738-4100-be48-af94288931f3/bond-cni-plugin/0.log" Apr 20 18:04:13.223557 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.223539 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9jtw5_f4675b7e-7738-4100-be48-af94288931f3/routeoverride-cni/0.log" Apr 20 18:04:13.245824 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.245807 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9jtw5_f4675b7e-7738-4100-be48-af94288931f3/whereabouts-cni-bincopy/0.log" Apr 20 18:04:13.267460 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.267439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9jtw5_f4675b7e-7738-4100-be48-af94288931f3/whereabouts-cni/0.log" Apr 20 18:04:13.752150 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.752121 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rlsjc_07e151c2-7294-492d-b56b-1fc480d9ab69/network-metrics-daemon/0.log" Apr 20 18:04:13.773706 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:13.773657 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rlsjc_07e151c2-7294-492d-b56b-1fc480d9ab69/kube-rbac-proxy/0.log" Apr 20 18:04:15.240185 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.240122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-controller/0.log" Apr 20 18:04:15.261248 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.261217 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/0.log" Apr 20 18:04:15.265615 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.265594 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovn-acl-logging/1.log" Apr 20 18:04:15.286131 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.286113 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/kube-rbac-proxy-node/0.log" Apr 20 18:04:15.307520 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.307499 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 18:04:15.325291 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.325270 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/northd/0.log" Apr 20 18:04:15.347095 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.347073 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/nbdb/0.log" Apr 20 18:04:15.367088 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.367066 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/sbdb/0.log" Apr 20 18:04:15.474118 ip-10-0-137-82 kubenswrapper[2575]: I0420 18:04:15.474087 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nfzpp_b17266e6-f66d-4b28-88a9-100b2da4666a/ovnkube-controller/0.log"