Apr 22 15:08:37.096475 ip-10-0-137-228 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:08:37.547933 ip-10-0-137-228 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:37.547933 ip-10-0-137-228 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:08:37.547933 ip-10-0-137-228 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:08:37.547933 ip-10-0-137-228 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:08:37.547933 ip-10-0-137-228 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
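(Editor's note: the deprecation warnings above say these flags belong in the file passed via --config; later in this log the kubelet reports --config="/etc/kubernetes/kubelet.conf". A minimal sketch of the equivalent KubeletConfiguration stanzas, with field names from the upstream kubelet-config-file docs — the systemReserved/eviction values are illustrative assumptions, not taken from this node:)

```yaml
# Hypothetical /etc/kubernetes/kubelet.conf fragment replacing the deprecated CLI flags
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"   # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir (illustrative path)
systemReserved:                                                # replaces --system-reserved (illustrative values)
  cpu: "500m"
  memory: "1Gi"
evictionHard:                                                  # --eviction-hard, suggested in place of --minimum-container-ttl-duration
  memory.available: "100Mi"
```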
Apr 22 15:08:37.551103 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.551010 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:08:37.553302 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553288 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:37.553302 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553302 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553307 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553310 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553313 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553316 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553319 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553321 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553324 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553327 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553329 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553332 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553335 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553337 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553340 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553342 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553349 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553352 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553355 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553358 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:37.553369 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553360 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553363 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553365 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553368 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553370 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553373 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553376 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553378 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553381 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553384 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553387 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553390 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553392 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553395 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553398 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553400 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553402 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553405 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553407 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553410 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:37.553820 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553412 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553415 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553417 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553421 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553423 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553426 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553429 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553432 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553434 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553436 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553439 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553441 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553444 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553446 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553450 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553454 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553458 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553461 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553463 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:37.554323 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553466 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553469 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553471 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553474 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553477 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553479 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553482 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553485 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553488 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553490 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553493 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553496 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553498 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553501 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553503 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553506 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553508 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553514 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553517 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:37.554793 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553522 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553527 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553531 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553534 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553537 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553540 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553542 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553545 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553911 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553916 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553919 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553922 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553925 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553927 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553930 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553933 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553935 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553938 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553941 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553944 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553949 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:08:37.555262 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553952 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553954 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553957 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553960 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553962 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553965 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553967 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553969 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553972 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553974 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553977 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553980 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553982 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553985 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553989 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553992 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553995 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.553997 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554000 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:08:37.555770 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554003 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554006 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554009 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554012 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554014 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554017 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554020 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554022 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554025 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554027 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554029 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554032 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554034 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554038 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554042 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554045 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554049 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554051 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554054 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:08:37.556250 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554057 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554060 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554063 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554065 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554068 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554073 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554075 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554078 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554081 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554084 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554086 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554089 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554091 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554094 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554097 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554099 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554102 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554105 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554107 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554110 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:08:37.556706 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554112 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554115 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554117 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554120 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554123 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554126 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554128 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554131 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554134 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554136 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554138 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554141 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554143 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554146 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.554148 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555556 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555573 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555579 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555584 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555589 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555592 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:08:37.557325 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555597 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555602 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555605 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555609 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555612 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555616 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555619 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555622 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555625 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555628 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555631 2577 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555633 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555636 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555641 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555644 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555647 2577 flags.go:64] FLAG: --config-dir=""
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555650 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555653 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555657 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555660 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555664 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555667 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555670 2577 flags.go:64] FLAG: --contention-profiling="false" Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555673 2577 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 15:08:37.557844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555676 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555679 2577 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555682 2577 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555686 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555690 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555693 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555696 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555699 2577 flags.go:64] FLAG: --enable-server="true" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555701 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555705 2577 flags.go:64] FLAG: --event-burst="100" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555709 2577 flags.go:64] FLAG: --event-qps="50" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555712 2577 flags.go:64] 
FLAG: --event-storage-age-limit="default=0" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555715 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555718 2577 flags.go:64] FLAG: --eviction-hard="" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555722 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555725 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555728 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555731 2577 flags.go:64] FLAG: --eviction-soft="" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555734 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555737 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555740 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555742 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555745 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555748 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555751 2577 flags.go:64] FLAG: --feature-gates="" Apr 22 15:08:37.558457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555755 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 15:08:37.559080 ip-10-0-137-228 
kubenswrapper[2577]: I0422 15:08:37.555758 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555761 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555764 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555767 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555771 2577 flags.go:64] FLAG: --help="false" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555774 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555777 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555781 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555784 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555787 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555791 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555794 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555797 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555800 2577 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555803 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555806 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555809 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555811 2577 flags.go:64] FLAG: --kube-reserved="" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555814 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555817 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555820 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555823 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555826 2577 flags.go:64] FLAG: --lock-file="" Apr 22 15:08:37.559080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555829 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555831 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555834 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555839 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555842 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 15:08:37.559709 ip-10-0-137-228 
kubenswrapper[2577]: I0422 15:08:37.555845 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555848 2577 flags.go:64] FLAG: --logging-format="text" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555850 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555854 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555856 2577 flags.go:64] FLAG: --manifest-url="" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555859 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555863 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555867 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555871 2577 flags.go:64] FLAG: --max-pods="110" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555874 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555877 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555880 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555883 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555888 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555891 2577 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555894 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555901 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555904 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555907 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 15:08:37.559709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555910 2577 flags.go:64] FLAG: --pod-cidr="" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555913 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555918 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555921 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555925 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555927 2577 flags.go:64] FLAG: --port="10250" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555930 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555933 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-068ec579b69d25166" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555936 2577 flags.go:64] FLAG: --qos-reserved="" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555939 
2577 flags.go:64] FLAG: --read-only-port="10255" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555943 2577 flags.go:64] FLAG: --register-node="true" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555946 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555949 2577 flags.go:64] FLAG: --register-with-taints="" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555953 2577 flags.go:64] FLAG: --registry-burst="10" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555956 2577 flags.go:64] FLAG: --registry-qps="5" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555959 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555962 2577 flags.go:64] FLAG: --reserved-memory="" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555965 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555968 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555971 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555974 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555977 2577 flags.go:64] FLAG: --runonce="false" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555980 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555983 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 
15:08:37.555986 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 22 15:08:37.560315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555989 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555993 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555996 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.555999 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556002 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556006 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556009 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556012 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556014 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556017 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556020 2577 flags.go:64] FLAG: --system-cgroups="" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556023 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556028 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556031 2577 flags.go:64] FLAG: 
--tls-cert-file="" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556034 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556576 2577 flags.go:64] FLAG: --tls-min-version="" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556580 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556583 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556586 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556589 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556592 2577 flags.go:64] FLAG: --v="2" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556596 2577 flags.go:64] FLAG: --version="false" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556600 2577 flags.go:64] FLAG: --vmodule="" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556604 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.556608 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 15:08:37.560915 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556702 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556705 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556708 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:08:37.561576 
ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556711 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556715 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556718 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556721 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556724 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556727 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556730 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556732 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556735 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556738 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556740 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556743 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556745 2577 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556748 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556750 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556753 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556755 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:08:37.561576 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556757 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556760 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556762 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556765 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556768 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556771 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556773 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556776 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 
15:08:37.556778 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556781 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556783 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556786 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556788 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556791 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556793 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556796 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556798 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556801 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556804 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556806 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:08:37.562080 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556809 2577 feature_gate.go:328] unrecognized feature 
gate: AdminNetworkPolicy Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556812 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556814 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556817 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556823 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556825 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556828 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556830 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556833 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556835 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556838 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556840 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556843 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556845 
2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556848 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556852 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556855 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556857 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556860 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556863 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:08:37.562636 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556865 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556867 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556870 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556872 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556875 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556877 2577 feature_gate.go:328] unrecognized feature gate: 
ClusterMonitoringConfig Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556880 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556882 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556885 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556887 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556890 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556893 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556895 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556898 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556901 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556904 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556908 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556911 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556914 2577 
feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:08:37.563117 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556918 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 15:08:37.563605 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556921 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:08:37.563605 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556924 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:08:37.563605 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556927 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:08:37.563605 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556930 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:08:37.563605 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556933 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:08:37.563605 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.556936 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:08:37.563605 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.557572 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:08:37.563812 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.563793 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 15:08:37.563846 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.563813 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 15:08:37.563876 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563859 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:08:37.563876 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563865 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:08:37.563876 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563868 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:08:37.563876 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563871 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:08:37.563876 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563874 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:08:37.563876 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563877 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:08:37.563876 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563880 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563883 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563885 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563888 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563891 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563894 2577 feature_gate.go:328] unrecognized feature gate: 
VSphereMultiDisk Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563896 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563899 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563901 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563904 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563906 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563909 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563911 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563914 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563917 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563920 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563922 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563925 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 
15:08:37.563928 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563930 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:08:37.564050 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563933 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563935 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563938 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563940 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563943 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563947 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563949 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563953 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563957 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563961 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563964 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563967 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563969 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563973 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563976 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563979 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563982 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563985 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563988 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:08:37.564606 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563991 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563993 2577 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563996 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.563999 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564001 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564004 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564006 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564009 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564011 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564014 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564016 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564019 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564022 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564024 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:08:37.565075 
ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564027 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564030 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564032 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564035 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564039 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564043 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:08:37.565075 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564046 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564049 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564051 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564054 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564057 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564060 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564063 2577 feature_gate.go:328] 
unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564066 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564068 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564071 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564073 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564076 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564078 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564081 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564084 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564086 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564089 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564091 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564094 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 
22 15:08:37.565582 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564097 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564099 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.564105 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564236 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564242 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564245 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564248 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564251 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564254 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564257 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 
15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564260 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564262 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564265 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564268 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564271 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564273 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:08:37.566088 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564276 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564278 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564281 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564284 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564287 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564289 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564292 2577 feature_gate.go:328] unrecognized 
feature gate: AWSClusterHostedDNSInstall Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564295 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564297 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564300 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564302 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564305 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564307 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564310 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564313 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564315 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564317 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564320 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564323 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:08:37.566508 ip-10-0-137-228 
kubenswrapper[2577]: W0422 15:08:37.564325 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:08:37.566508 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564327 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564331 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564333 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564336 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564338 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564342 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564346 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564349 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564351 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564354 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564357 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564359 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564362 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564364 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564367 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564370 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564372 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564375 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 
15:08:37.564377 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:08:37.566998 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564380 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564382 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564385 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564388 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564390 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564393 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564395 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564398 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564400 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564403 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564405 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564408 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 
15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564412 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564415 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564418 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564421 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564424 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564427 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564430 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 15:08:37.567476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564432 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564435 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564437 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564440 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564442 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:08:37.567935 ip-10-0-137-228 
kubenswrapper[2577]: W0422 15:08:37.564445 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564448 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564450 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564453 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564456 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564458 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564461 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564463 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564466 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:37.564468 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.564472 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 15:08:37.567935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.565134 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 15:08:37.568360 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.567100 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 15:08:37.568360 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.568226 2577 server.go:1019] "Starting client certificate rotation" Apr 22 15:08:37.568360 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.568318 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 15:08:37.569214 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.569202 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 15:08:37.599623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.599595 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 15:08:37.602114 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.602093 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 15:08:37.621782 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.621759 2577 log.go:25] "Validated CRI v1 runtime API" Apr 22 15:08:37.627250 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.627227 2577 log.go:25] "Validated CRI v1 image API" Apr 22 15:08:37.628336 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.628319 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 
15:08:37.628572 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.628546 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 15:08:37.633138 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.633109 2577 fs.go:135] Filesystem UUIDs: map[2f508dcb-750f-4389-9aad-c1eb65568dc7:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8930fe95-147b-42a2-811b-0c3409f82e46:/dev/nvme0n1p4] Apr 22 15:08:37.633189 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.633138 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 15:08:37.638468 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.638355 2577 manager.go:217] Machine: {Timestamp:2026-04-22 15:08:37.637264633 +0000 UTC m=+0.423901688 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099782 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec222893f73dc98878b85c61f7d3c4ab SystemUUID:ec222893-f73d-c988-78b8-5c61f7d3c4ab BootID:aca86896-4416-4b42-ba4a-39f7062115d1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 
Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:38:90:d3:4e:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:38:90:d3:4e:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:92:5f:a8:fa:c7:cb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 15:08:37.638468 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.638464 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 15:08:37.638578 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.638566 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 15:08:37.640276 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.640252 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 15:08:37.640410 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.640278 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-228.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 15:08:37.640452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.640421 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 15:08:37.640452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.640429 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 15:08:37.640452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.640442 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:08:37.640535 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.640459 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 15:08:37.641864 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.641854 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:08:37.641972 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.641963 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 15:08:37.644270 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.644261 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 22 15:08:37.644310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.644279 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 15:08:37.644310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.644297 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 15:08:37.644310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.644307 2577 kubelet.go:397] "Adding apiserver pod source" Apr 22 15:08:37.644410 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.644319 2577 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 15:08:37.645382 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.645371 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:08:37.645427 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.645389 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 15:08:37.649158 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.649086 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 15:08:37.651179 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.651161 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 15:08:37.652856 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652844 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 15:08:37.652902 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652861 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 15:08:37.652902 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652868 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 15:08:37.652902 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652873 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 15:08:37.652902 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652879 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 15:08:37.652902 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652884 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 15:08:37.652902 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652890 2577 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 15:08:37.652902 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652896 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 15:08:37.652902 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652902 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 15:08:37.653117 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652908 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 15:08:37.653117 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652917 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 15:08:37.653117 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.652925 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 15:08:37.653726 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.653715 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:08:37.653762 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.653730 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:08:37.656734 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.656713 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-228.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 15:08:37.657394 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.657381 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:08:37.657441 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.657416 2577 server.go:1295] "Started kubelet" Apr 22 15:08:37.657493 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.657468 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-228.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 15:08:37.657493 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.657477 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 15:08:37.657585 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.657544 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:08:37.657618 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.657568 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:08:37.657647 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.657636 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:08:37.658235 ip-10-0-137-228 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 15:08:37.658795 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.658756 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:08:37.660281 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.660266 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:08:37.664730 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.664713 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qmxkj" Apr 22 15:08:37.666006 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.665794 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:08:37.666322 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.666304 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:08:37.667014 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.666997 2577 factory.go:55] Registering systemd factory Apr 22 15:08:37.667169 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667158 2577 factory.go:223] Registration of the systemd container factory successfully Apr 22 15:08:37.667351 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.666132 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-228.ec2.internal.18a8b65250d55bea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-228.ec2.internal,UID:ip-10-0-137-228.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-228.ec2.internal,},FirstTimestamp:2026-04-22 15:08:37.657394154 +0000 UTC m=+0.444031205,LastTimestamp:2026-04-22 
15:08:37.657394154 +0000 UTC m=+0.444031205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-228.ec2.internal,}" Apr 22 15:08:37.667522 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667505 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:08:37.667595 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667509 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 15:08:37.667652 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667600 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:08:37.667745 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667730 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:08:37.667800 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667746 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:08:37.667846 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667828 2577 factory.go:153] Registering CRI-O factory Apr 22 15:08:37.667846 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667845 2577 factory.go:223] Registration of the crio container factory successfully Apr 22 15:08:37.667959 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667930 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 15:08:37.668058 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667968 2577 factory.go:103] Registering Raw factory Apr 22 15:08:37.668058 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.667990 2577 manager.go:1196] Started watching for new ooms in manager Apr 22 15:08:37.668058 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.668019 2577 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-137-228.ec2.internal\" not found" Apr 22 15:08:37.668394 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.668342 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 15:08:37.668457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.668430 2577 manager.go:319] Starting recovery of all containers Apr 22 15:08:37.672366 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.672340 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 15:08:37.672526 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.672463 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-228.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 15:08:37.679253 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.679228 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qmxkj" Apr 22 15:08:37.681289 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.681274 2577 manager.go:324] Recovery completed Apr 22 15:08:37.685010 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.684993 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:37.687473 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.687456 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientMemory" Apr 22 
15:08:37.687547 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.687490 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:37.687547 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.687505 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:37.687995 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.687981 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 15:08:37.688047 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.687996 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 15:08:37.688047 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.688012 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:08:37.689715 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.689658 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-228.ec2.internal.18a8b65252a05059 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-228.ec2.internal,UID:ip-10-0-137-228.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-228.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-228.ec2.internal,},FirstTimestamp:2026-04-22 15:08:37.687472217 +0000 UTC m=+0.474109272,LastTimestamp:2026-04-22 15:08:37.687472217 +0000 UTC m=+0.474109272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-228.ec2.internal,}" Apr 22 15:08:37.690709 ip-10-0-137-228 kubenswrapper[2577]: I0422 
15:08:37.690697 2577 policy_none.go:49] "None policy: Start" Apr 22 15:08:37.690759 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.690713 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 15:08:37.690759 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.690723 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 22 15:08:37.724796 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.724782 2577 manager.go:341] "Starting Device Plugin manager" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.724811 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.724820 2577 server.go:85] "Starting device plugin registration server" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.725011 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.725024 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.725123 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.725213 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.725222 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.725718 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 15:08:37.733867 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.725761 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-228.ec2.internal\" not found" Apr 22 15:08:37.797460 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.797429 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 15:08:37.798762 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.798711 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 15:08:37.798762 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.798736 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 15:08:37.798762 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.798755 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 15:08:37.798942 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.798764 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 15:08:37.798942 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.798796 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 15:08:37.803492 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.803473 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:08:37.825474 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.825461 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:37.826343 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.826330 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:37.826410 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.826358 2577 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:37.826410 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.826369 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:37.826410 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.826405 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.834891 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.834877 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.834948 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.834901 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-228.ec2.internal\": node \"ip-10-0-137-228.ec2.internal\" not found" Apr 22 15:08:37.855172 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.855152 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found" Apr 22 15:08:37.898905 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.898882 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal"] Apr 22 15:08:37.898985 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.898941 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:37.901017 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.900991 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:37.901122 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.901024 2577 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:37.901122 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.901038 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:37.902241 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902227 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:37.902394 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902378 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.902439 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902409 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:37.902888 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902867 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:37.902978 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902898 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:37.902978 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902912 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:37.902978 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902871 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:37.902978 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902980 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 22 15:08:37.903165 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.902991 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:37.904188 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.904172 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.904250 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.904216 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:08:37.904790 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.904776 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:08:37.904844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.904803 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:08:37.904844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.904818 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:08:37.927721 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.927702 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-228.ec2.internal\" not found" node="ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.931944 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.931928 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-228.ec2.internal\" not found" node="ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.955637 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:37.955615 2577 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found" Apr 22 15:08:37.970068 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.970047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/514e0b046b2478b20bc5c0831a3ea228-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal\" (UID: \"514e0b046b2478b20bc5c0831a3ea228\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.970139 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.970072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/514e0b046b2478b20bc5c0831a3ea228-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal\" (UID: \"514e0b046b2478b20bc5c0831a3ea228\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" Apr 22 15:08:37.970139 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:37.970090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4f00231807412dde1ca296d35e880b1b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-228.ec2.internal\" (UID: \"4f00231807412dde1ca296d35e880b1b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal" Apr 22 15:08:38.055993 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.055939 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found" Apr 22 15:08:38.070789 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.070760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/514e0b046b2478b20bc5c0831a3ea228-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal\" (UID: \"514e0b046b2478b20bc5c0831a3ea228\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" Apr 22 15:08:38.070867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.070794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/514e0b046b2478b20bc5c0831a3ea228-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal\" (UID: \"514e0b046b2478b20bc5c0831a3ea228\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" Apr 22 15:08:38.070867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.070810 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4f00231807412dde1ca296d35e880b1b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-228.ec2.internal\" (UID: \"4f00231807412dde1ca296d35e880b1b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal" Apr 22 15:08:38.070867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.070834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4f00231807412dde1ca296d35e880b1b-config\") pod \"kube-apiserver-proxy-ip-10-0-137-228.ec2.internal\" (UID: \"4f00231807412dde1ca296d35e880b1b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal" Apr 22 15:08:38.070867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.070856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/514e0b046b2478b20bc5c0831a3ea228-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal\" (UID: \"514e0b046b2478b20bc5c0831a3ea228\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" Apr 22 15:08:38.071006 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.070858 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/514e0b046b2478b20bc5c0831a3ea228-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal\" (UID: \"514e0b046b2478b20bc5c0831a3ea228\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal"
Apr 22 15:08:38.156396 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.156371 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found"
Apr 22 15:08:38.229878 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.229846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal"
Apr 22 15:08:38.234457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.234442 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal"
Apr 22 15:08:38.257209 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.257170 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found"
Apr 22 15:08:38.357777 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.357707 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found"
Apr 22 15:08:38.458302 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.458265 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found"
Apr 22 15:08:38.558996 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.558968 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found"
Apr 22 15:08:38.568258 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.568233 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 15:08:38.568381 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.568367 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:08:38.659656 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.659598 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found"
Apr 22 15:08:38.666798 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.666783 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 15:08:38.681523 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.681493 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:03:37 +0000 UTC" deadline="2027-11-23 07:08:01.517716897 +0000 UTC"
Apr 22 15:08:38.681523 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.681516 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13911h59m22.836203387s"
Apr 22 15:08:38.684061 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.684044 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:08:38.714625 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.714607 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2chsf"
Apr 22 15:08:38.721719 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.721699 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2chsf"
Apr 22 15:08:38.759713 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.759695 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found"
Apr 22 15:08:38.860823 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:38.860791 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-228.ec2.internal\" not found"
Apr 22 15:08:38.899339 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:38.899307 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod514e0b046b2478b20bc5c0831a3ea228.slice/crio-5a410ff0b2f24accb29c4d7d5a3ea06bcf98255846a578f278574a62030ef62d WatchSource:0}: Error finding container 5a410ff0b2f24accb29c4d7d5a3ea06bcf98255846a578f278574a62030ef62d: Status 404 returned error can't find the container with id 5a410ff0b2f24accb29c4d7d5a3ea06bcf98255846a578f278574a62030ef62d
Apr 22 15:08:38.899937 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:38.899914 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f00231807412dde1ca296d35e880b1b.slice/crio-2bf07822d6444ee5971576efe02658c9ea2186cd1a2ddab961cc10a0745f83e5 WatchSource:0}: Error finding container 2bf07822d6444ee5971576efe02658c9ea2186cd1a2ddab961cc10a0745f83e5: Status 404 returned error can't find the container with id 2bf07822d6444ee5971576efe02658c9ea2186cd1a2ddab961cc10a0745f83e5
Apr 22 15:08:38.903134 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.903118 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:08:38.904410 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.904387 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:08:38.967570 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.967514 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal"
Apr 22 15:08:38.982839 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.982820 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:08:38.984582 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.984570 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal"
Apr 22 15:08:38.995608 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:38.995593 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:08:39.007628 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.007609 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:08:39.135510 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.135485 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:08:39.417428 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.417353 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:08:39.645888 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.645630 2577 apiserver.go:52] "Watching apiserver"
Apr 22 15:08:39.655063 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.654886 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 15:08:39.656058 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.656028 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vrpzt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal","openshift-multus/network-metrics-daemon-9nk69","openshift-network-diagnostics/network-check-target-bvrrk","kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal","openshift-multus/multus-94cqk","openshift-multus/multus-additional-cni-plugins-sg7kx","openshift-network-operator/iptables-alerter-p49tx","openshift-ovn-kubernetes/ovnkube-node-42xf8","kube-system/konnectivity-agent-p49zs","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z","openshift-cluster-node-tuning-operator/tuned-s5jjg","openshift-dns/node-resolver-4sxvb"]
Apr 22 15:08:39.657739 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.657717 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p49tx"
Apr 22 15:08:39.660482 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.660461 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.660582 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.660529 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:08:39.660640 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.660587 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 15:08:39.660735 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:39.660585 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e"
Apr 22 15:08:39.661374 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.661041 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 15:08:39.661374 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.661222 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-w64wz\""
Apr 22 15:08:39.661374 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.661274 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:08:39.661936 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.661917 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z"
Apr 22 15:08:39.664437 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.664307 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:08:39.665938 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.665254 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.668987 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.667231 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.668987 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.668593 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vrpzt"
Apr 22 15:08:39.670104 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.670083 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p49zs"
Apr 22 15:08:39.671942 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.671636 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk"
Apr 22 15:08:39.671942 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:39.671694 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda"
Apr 22 15:08:39.672805 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.672788 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4sxvb"
Apr 22 15:08:39.673137 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.672905 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sg7kx"
Apr 22 15:08:39.674925 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.674866 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 15:08:39.675017 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.674968 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 15:08:39.675078 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.675034 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 15:08:39.675130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.674866 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 15:08:39.675267 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.675238 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 15:08:39.675525 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.675504 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 15:08:39.675725 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.675709 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 15:08:39.676100 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.676084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 15:08:39.676265 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.676247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-42w7s\""
Apr 22 15:08:39.676343 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.676313 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 15:08:39.676580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.676564 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 15:08:39.676580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.676574 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 15:08:39.676905 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.676866 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dmdzt\""
Apr 22 15:08:39.676991 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.676917 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 15:08:39.682703 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.682663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z"
Apr 22 15:08:39.684215 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.683713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-run\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.684215 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.683747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-cni-bin\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.684215 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.683842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d075efdc-d5f5-490a-a543-09e52a1f9e38-host\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt"
Apr 22 15:08:39.684215 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.683864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-systemd\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.684215 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.683878 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 15:08:39.684215 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.682836 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 15:08:39.684215 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.683327 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 15:08:39.684582 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.684298 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5g8cc\""
Apr 22 15:08:39.684582 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.683618 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-82vnw\""
Apr 22 15:08:39.684582 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.684548 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 15:08:39.684726 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.684717 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 15:08:39.684894 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.684880 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 15:08:39.685108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 15:08:39.685227 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685189 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b2fpn\""
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.683884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-node-log\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685630 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685662 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-g4ghc\""
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-modprobe-d\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-systemd\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-systemd-units\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685876 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-sys\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685946 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tcnx\" (UniqueName: \"kubernetes.io/projected/ab887213-98f0-4051-a99d-d23453b1ec24-kube-api-access-7tcnx\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.685991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-env-overrides\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thk7t\" (UniqueName: \"kubernetes.io/projected/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-kube-api-access-thk7t\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-hostroot\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686153 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-registration-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686186 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysctl-d\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686239 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 15:08:39.686372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-k8s-cni-cncf-io\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.687352 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-cni-bin\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.687352 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.686346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgvs4\" (UniqueName: \"kubernetes.io/projected/d075efdc-d5f5-490a-a543-09e52a1f9e38-kube-api-access-qgvs4\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt"
Apr 22 15:08:39.687352 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.687123 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 15:08:39.688170 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.688149 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-txldn\""
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.688228 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cg6jl\""
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.688263 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.688457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsnhd\" (UniqueName: \"kubernetes.io/projected/daf221a4-075f-4ecb-83fb-afb1b4d25997-kube-api-access-dsnhd\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ab887213-98f0-4051-a99d-d23453b1ec24-etc-tuned\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-sys-fs\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689081 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-run-ovn-kubernetes\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-cni-binary-copy\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab887213-98f0-4051-a99d-d23453b1ec24-tmp\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689158 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-cnibin\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-socket-dir-parent\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689220 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-device-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-kubernetes\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysctl-conf\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-lib-modules\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-ovn\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689335 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-netns\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d075efdc-d5f5-490a-a543-09e52a1f9e38-serviceca\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt"
Apr 22 15:08:39.690245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-cni-binary-copy\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689459 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovn-node-metrics-cert\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689484 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-etc-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-log-socket\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-var-lib-kubelet\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689555 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-kubelet\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-var-lib-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689602 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-conf-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689625 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-daemon-config\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx2l5\" (UniqueName: \"kubernetes.io/projected/83a4c543-c34e-4cca-b476-71845b4617e3-kube-api-access-wx2l5\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689704 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\"
(UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-cni-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689731 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-etc-kubernetes\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmfm\" (UniqueName: \"kubernetes.io/projected/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-kube-api-access-4vmfm\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-cnibin\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.691118 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-os-release\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689830 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-host-slash\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2djj\" (UniqueName: \"kubernetes.io/projected/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-kube-api-access-k2djj\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689905 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-cni-netd\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovnkube-script-lib\") pod 
\"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689954 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d525\" (UniqueName: \"kubernetes.io/projected/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-kube-api-access-5d525\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.689977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-cni-multus\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-kubelet\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690028 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-multus-certs\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-iptables-alerter-script\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovnkube-config\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690127 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-system-cni-dir\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690163 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-socket-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysconfig\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.691882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-host\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.692641 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-slash\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.692641 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-run-netns\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.692641 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-system-cni-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.692641 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.690359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-os-release\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.723941 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.723892 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:03:38 +0000 UTC" deadline="2027-11-30 03:24:36.698577696 +0000 UTC" Apr 22 15:08:39.723941 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.723920 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14076h15m56.974661252s" Apr 22 15:08:39.769249 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.769222 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 15:08:39.791051 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-env-overrides\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.791211 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791063 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.791211 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thk7t\" (UniqueName: \"kubernetes.io/projected/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-kube-api-access-thk7t\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.791211 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-hostroot\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791211 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-registration-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.791211 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysctl-d\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.791211 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-k8s-cni-cncf-io\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-cni-bin\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgvs4\" (UniqueName: \"kubernetes.io/projected/d075efdc-d5f5-490a-a543-09e52a1f9e38-kube-api-access-qgvs4\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsnhd\" (UniqueName: \"kubernetes.io/projected/daf221a4-075f-4ecb-83fb-afb1b4d25997-kube-api-access-dsnhd\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ab887213-98f0-4051-a99d-d23453b1ec24-etc-tuned\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"sys-fs\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-sys-fs\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-run-ovn-kubernetes\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791383 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-cni-binary-copy\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-k8s-cni-cncf-io\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab887213-98f0-4051-a99d-d23453b1ec24-tmp\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.791447 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-cnibin\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-socket-dir-parent\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-device-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-kubernetes\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791533 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-hostroot\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysctl-conf\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-registration-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-env-overrides\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-lib-modules\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-ovn\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysctl-conf\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-netns\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d075efdc-d5f5-490a-a543-09e52a1f9e38-serviceca\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-run-ovn-kubernetes\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-cni-binary-copy\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:39.791871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791803 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-cni-bin\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovn-node-metrics-cert\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c6e1747-9197-468a-b61e-0e687eab6eaa-agent-certs\") pod \"konnectivity-agent-p49zs\" (UID: \"7c6e1747-9197-468a-b61e-0e687eab6eaa\") " pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791877 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-tmp-dir\") pod \"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-socket-dir-parent\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791904 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhnz6\" (UniqueName: \"kubernetes.io/projected/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-kube-api-access-mhnz6\") pod \"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-etc-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.792651 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.791985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-log-socket\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-var-lib-kubelet\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-kubelet\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-var-lib-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-conf-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.792651 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-daemon-config\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792188 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-sys-fs\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792185 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-device-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-cni-binary-copy\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.792651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792258 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-cnibin\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792329 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-kubernetes\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysctl-d\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-var-lib-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792583 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d075efdc-d5f5-490a-a543-09e52a1f9e38-serviceca\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-lib-modules\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-ovn\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792728 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-netns\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-etc-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792797 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-conf-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-var-lib-kubelet\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.792846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-log-socket\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793082 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-kubelet\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:39.793188 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx2l5\" (UniqueName: \"kubernetes.io/projected/83a4c543-c34e-4cca-b476-71845b4617e3-kube-api-access-wx2l5\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:39.793298 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:40.293251492 +0000 UTC m=+3.079888544 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-cni-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.793580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-etc-kubernetes\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmfm\" (UniqueName: \"kubernetes.io/projected/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-kube-api-access-4vmfm\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.794516 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-cnibin\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-os-release\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793490 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-daemon-config\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-multus-cni-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " 
pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-openvswitch\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-cnibin\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-host-slash\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-cni-binary-copy\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-etc-kubernetes\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " 
pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-host-slash\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2djj\" (UniqueName: \"kubernetes.io/projected/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-kube-api-access-k2djj\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-cni-netd\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793736 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovnkube-script-lib\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d525\" (UniqueName: \"kubernetes.io/projected/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-kube-api-access-5d525\") pod \"ovnkube-node-42xf8\" 
(UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.794516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793791 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-cni-multus\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-kubelet\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-multus-certs\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-iptables-alerter-script\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793889 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovnkube-config\") 
pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-system-cni-dir\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-cni-netd\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793960 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-run-multus-certs\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-socket-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.793992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysconfig\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-cni-multus\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-host\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-slash\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-host-var-lib-kubelet\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-run-netns\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794096 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-run-netns\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.795342 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-system-cni-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-os-release\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-slash\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794220 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-run\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-sysconfig\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794252 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-cni-bin\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794261 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-run\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d075efdc-d5f5-490a-a543-09e52a1f9e38-host\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-systemd\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-os-release\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-node-log\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794358 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-node-log\") pod 
\"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794387 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-modprobe-d\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-etc-selinux\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794152 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-system-cni-dir\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-systemd\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.796299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-systemd-units\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794453 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-systemd\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794486 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-systemd-units\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794500 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-host\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-socket-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-iptables-alerter-script\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-system-cni-dir\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/daf221a4-075f-4ecb-83fb-afb1b4d25997-os-release\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c6e1747-9197-468a-b61e-0e687eab6eaa-konnectivity-ca\") pod \"konnectivity-agent-p49zs\" (UID: \"7c6e1747-9197-468a-b61e-0e687eab6eaa\") " pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794614 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-hosts-file\") pod \"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794640 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-sys\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tcnx\" (UniqueName: \"kubernetes.io/projected/ab887213-98f0-4051-a99d-d23453b1ec24-kube-api-access-7tcnx\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-sys\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovnkube-script-lib\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.797156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.794965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/83a4c543-c34e-4cca-b476-71845b4617e3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.795006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-host-cni-bin\") pod \"ovnkube-node-42xf8\" (UID: 
\"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.795041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d075efdc-d5f5-490a-a543-09e52a1f9e38-host\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.795039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/daf221a4-075f-4ecb-83fb-afb1b4d25997-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: \"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.795125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ab887213-98f0-4051-a99d-d23453b1ec24-etc-modprobe-d\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.795269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovnkube-config\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.795473 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-ovn-node-metrics-cert\") pod \"ovnkube-node-42xf8\" (UID: 
\"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.795600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab887213-98f0-4051-a99d-d23453b1ec24-tmp\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.796598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-run-systemd\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.797948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.797125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ab887213-98f0-4051-a99d-d23453b1ec24-etc-tuned\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.803757 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.803710 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" event={"ID":"514e0b046b2478b20bc5c0831a3ea228","Type":"ContainerStarted","Data":"5a410ff0b2f24accb29c4d7d5a3ea06bcf98255846a578f278574a62030ef62d"} Apr 22 15:08:39.804738 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.804713 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal" event={"ID":"4f00231807412dde1ca296d35e880b1b","Type":"ContainerStarted","Data":"2bf07822d6444ee5971576efe02658c9ea2186cd1a2ddab961cc10a0745f83e5"} Apr 22 15:08:39.815717 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.815694 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgvs4\" (UniqueName: \"kubernetes.io/projected/d075efdc-d5f5-490a-a543-09e52a1f9e38-kube-api-access-qgvs4\") pod \"node-ca-vrpzt\" (UID: \"d075efdc-d5f5-490a-a543-09e52a1f9e38\") " pod="openshift-image-registry/node-ca-vrpzt" Apr 22 15:08:39.818303 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.818280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thk7t\" (UniqueName: \"kubernetes.io/projected/5fcf9aef-8476-4cab-aa68-0f61db3e03f3-kube-api-access-thk7t\") pod \"iptables-alerter-p49tx\" (UID: \"5fcf9aef-8476-4cab-aa68-0f61db3e03f3\") " pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.823290 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.823249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tcnx\" (UniqueName: \"kubernetes.io/projected/ab887213-98f0-4051-a99d-d23453b1ec24-kube-api-access-7tcnx\") pod \"tuned-s5jjg\" (UID: \"ab887213-98f0-4051-a99d-d23453b1ec24\") " pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:39.823774 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.823748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx2l5\" (UniqueName: \"kubernetes.io/projected/83a4c543-c34e-4cca-b476-71845b4617e3-kube-api-access-wx2l5\") pod \"aws-ebs-csi-driver-node-zhm4z\" (UID: \"83a4c543-c34e-4cca-b476-71845b4617e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:39.823934 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.823913 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsnhd\" (UniqueName: \"kubernetes.io/projected/daf221a4-075f-4ecb-83fb-afb1b4d25997-kube-api-access-dsnhd\") pod \"multus-additional-cni-plugins-sg7kx\" (UID: 
\"daf221a4-075f-4ecb-83fb-afb1b4d25997\") " pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:39.824288 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.824260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d525\" (UniqueName: \"kubernetes.io/projected/82f59d25-cb9b-4bfc-a131-c631b53ef9c3-kube-api-access-5d525\") pod \"ovnkube-node-42xf8\" (UID: \"82f59d25-cb9b-4bfc-a131-c631b53ef9c3\") " pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:39.825923 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.825366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmfm\" (UniqueName: \"kubernetes.io/projected/4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177-kube-api-access-4vmfm\") pod \"multus-94cqk\" (UID: \"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177\") " pod="openshift-multus/multus-94cqk" Apr 22 15:08:39.825923 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.825635 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2djj\" (UniqueName: \"kubernetes.io/projected/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-kube-api-access-k2djj\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:39.895916 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.895886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c6e1747-9197-468a-b61e-0e687eab6eaa-konnectivity-ca\") pod \"konnectivity-agent-p49zs\" (UID: \"7c6e1747-9197-468a-b61e-0e687eab6eaa\") " pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:08:39.895916 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.895920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-hosts-file\") pod 
\"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.896122 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.895968 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c6e1747-9197-468a-b61e-0e687eab6eaa-agent-certs\") pod \"konnectivity-agent-p49zs\" (UID: \"7c6e1747-9197-468a-b61e-0e687eab6eaa\") " pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:08:39.896122 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.896001 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-tmp-dir\") pod \"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.896122 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.896031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhnz6\" (UniqueName: \"kubernetes.io/projected/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-kube-api-access-mhnz6\") pod \"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.896290 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.896262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:39.896368 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.896343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-tmp-dir\") pod 
\"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.896368 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.896352 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-hosts-file\") pod \"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.896523 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.896499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c6e1747-9197-468a-b61e-0e687eab6eaa-konnectivity-ca\") pod \"konnectivity-agent-p49zs\" (UID: \"7c6e1747-9197-468a-b61e-0e687eab6eaa\") " pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:08:39.898842 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.898820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c6e1747-9197-468a-b61e-0e687eab6eaa-agent-certs\") pod \"konnectivity-agent-p49zs\" (UID: \"7c6e1747-9197-468a-b61e-0e687eab6eaa\") " pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:08:39.907560 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:39.907541 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:39.907668 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:39.907564 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:39.907668 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:39.907577 2577 projected.go:194] Error preparing data for projected volume kube-api-access-fh8bz for pod 
openshift-network-diagnostics/network-check-target-bvrrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:39.907668 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:39.907641 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz podName:c7876708-f581-4c0c-becb-c7c90e442cda nodeName:}" failed. No retries permitted until 2026-04-22 15:08:40.407624078 +0000 UTC m=+3.194261133 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fh8bz" (UniqueName: "kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz") pod "network-check-target-bvrrk" (UID: "c7876708-f581-4c0c-becb-c7c90e442cda") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:39.909940 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.909916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhnz6\" (UniqueName: \"kubernetes.io/projected/031a7138-6b28-4cf1-9f28-ca9c3f9e3225-kube-api-access-mhnz6\") pod \"node-resolver-4sxvb\" (UID: \"031a7138-6b28-4cf1-9f28-ca9c3f9e3225\") " pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:39.976155 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.976079 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p49tx" Apr 22 15:08:39.990645 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:39.990622 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" Apr 22 15:08:40.001303 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.001277 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" Apr 22 15:08:40.007954 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.007936 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-94cqk" Apr 22 15:08:40.016627 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.016604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:08:40.024261 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.024242 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vrpzt" Apr 22 15:08:40.032847 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.032829 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:08:40.040364 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.040343 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4sxvb" Apr 22 15:08:40.046929 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.046907 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" Apr 22 15:08:40.299108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.299018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:40.299293 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:40.299138 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:40.299293 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:40.299222 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. No retries permitted until 2026-04-22 15:08:41.299205657 +0000 UTC m=+4.085842709 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:40.500482 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.500441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:40.500653 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:40.500629 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:40.500787 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:40.500658 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:40.500787 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:40.500672 2577 projected.go:194] Error preparing data for projected volume kube-api-access-fh8bz for pod openshift-network-diagnostics/network-check-target-bvrrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:40.500787 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:40.500731 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz podName:c7876708-f581-4c0c-becb-c7c90e442cda nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:41.50071715 +0000 UTC m=+4.287354195 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fh8bz" (UniqueName: "kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz") pod "network-check-target-bvrrk" (UID: "c7876708-f581-4c0c-becb-c7c90e442cda") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:40.590570 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.590546 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd075efdc_d5f5_490a_a543_09e52a1f9e38.slice/crio-b369ed9d12de8dc75c8c4349fad6a8cceda994a6dcb07d4dda36a103eb0c3d4e WatchSource:0}: Error finding container b369ed9d12de8dc75c8c4349fad6a8cceda994a6dcb07d4dda36a103eb0c3d4e: Status 404 returned error can't find the container with id b369ed9d12de8dc75c8c4349fad6a8cceda994a6dcb07d4dda36a103eb0c3d4e Apr 22 15:08:40.602335 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.602109 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab887213_98f0_4051_a99d_d23453b1ec24.slice/crio-bd65d94cd8205beff6862aba88cb9bae8761a3bf540990d716475e8350494e5d WatchSource:0}: Error finding container bd65d94cd8205beff6862aba88cb9bae8761a3bf540990d716475e8350494e5d: Status 404 returned error can't find the container with id bd65d94cd8205beff6862aba88cb9bae8761a3bf540990d716475e8350494e5d Apr 22 15:08:40.603713 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.603690 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a4c543_c34e_4cca_b476_71845b4617e3.slice/crio-7371fceb2d64dafa45303d2799bcf22c51ef142772058f192f139e29dc9f0b7a WatchSource:0}: Error finding container 
7371fceb2d64dafa45303d2799bcf22c51ef142772058f192f139e29dc9f0b7a: Status 404 returned error can't find the container with id 7371fceb2d64dafa45303d2799bcf22c51ef142772058f192f139e29dc9f0b7a Apr 22 15:08:40.608314 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.608289 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bd5d6e7_5c42_493b_9b0c_b9dd5bf3d177.slice/crio-2eb7f13b08124d3abfcdb6f1daabae6e69305a57445f095454f50c6fccd5bbdd WatchSource:0}: Error finding container 2eb7f13b08124d3abfcdb6f1daabae6e69305a57445f095454f50c6fccd5bbdd: Status 404 returned error can't find the container with id 2eb7f13b08124d3abfcdb6f1daabae6e69305a57445f095454f50c6fccd5bbdd Apr 22 15:08:40.608992 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.608965 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c6e1747_9197_468a_b61e_0e687eab6eaa.slice/crio-1e150ece773c230ecd1a8a8a12f8fbe1ad3d75e3d01aacce231364c8384f10fa WatchSource:0}: Error finding container 1e150ece773c230ecd1a8a8a12f8fbe1ad3d75e3d01aacce231364c8384f10fa: Status 404 returned error can't find the container with id 1e150ece773c230ecd1a8a8a12f8fbe1ad3d75e3d01aacce231364c8384f10fa Apr 22 15:08:40.609586 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.609554 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaf221a4_075f_4ecb_83fb_afb1b4d25997.slice/crio-5c44d9955e89f5b915f24be3beddacd9b41615cbe8e47b7d601d38276d709957 WatchSource:0}: Error finding container 5c44d9955e89f5b915f24be3beddacd9b41615cbe8e47b7d601d38276d709957: Status 404 returned error can't find the container with id 5c44d9955e89f5b915f24be3beddacd9b41615cbe8e47b7d601d38276d709957 Apr 22 15:08:40.610429 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.610407 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82f59d25_cb9b_4bfc_a131_c631b53ef9c3.slice/crio-1359a1fd9cd6cd7c37a29aa5bbef982cf02932c2dff00192f62628a6163d5df2 WatchSource:0}: Error finding container 1359a1fd9cd6cd7c37a29aa5bbef982cf02932c2dff00192f62628a6163d5df2: Status 404 returned error can't find the container with id 1359a1fd9cd6cd7c37a29aa5bbef982cf02932c2dff00192f62628a6163d5df2 Apr 22 15:08:40.611549 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.611367 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod031a7138_6b28_4cf1_9f28_ca9c3f9e3225.slice/crio-e4172698541827bf504f51d739536dc49b48a2ff7479878c9c3f6f0338a84c4d WatchSource:0}: Error finding container e4172698541827bf504f51d739536dc49b48a2ff7479878c9c3f6f0338a84c4d: Status 404 returned error can't find the container with id e4172698541827bf504f51d739536dc49b48a2ff7479878c9c3f6f0338a84c4d Apr 22 15:08:40.613516 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:08:40.613224 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fcf9aef_8476_4cab_aa68_0f61db3e03f3.slice/crio-801893ceb617db864f0bb64a7f193c27fba9cc98776134a7cfef2a8dc1732d04 WatchSource:0}: Error finding container 801893ceb617db864f0bb64a7f193c27fba9cc98776134a7cfef2a8dc1732d04: Status 404 returned error can't find the container with id 801893ceb617db864f0bb64a7f193c27fba9cc98776134a7cfef2a8dc1732d04 Apr 22 15:08:40.725072 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.725037 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:03:38 +0000 UTC" deadline="2028-01-10 22:24:18.042467282 +0000 UTC" Apr 22 15:08:40.725072 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.725066 2577 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15079h15m37.317404827s" Apr 22 15:08:40.807306 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.807275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" event={"ID":"ab887213-98f0-4051-a99d-d23453b1ec24","Type":"ContainerStarted","Data":"bd65d94cd8205beff6862aba88cb9bae8761a3bf540990d716475e8350494e5d"} Apr 22 15:08:40.808379 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.808356 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vrpzt" event={"ID":"d075efdc-d5f5-490a-a543-09e52a1f9e38","Type":"ContainerStarted","Data":"b369ed9d12de8dc75c8c4349fad6a8cceda994a6dcb07d4dda36a103eb0c3d4e"} Apr 22 15:08:40.809409 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.809384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"1359a1fd9cd6cd7c37a29aa5bbef982cf02932c2dff00192f62628a6163d5df2"} Apr 22 15:08:40.810336 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.810310 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerStarted","Data":"5c44d9955e89f5b915f24be3beddacd9b41615cbe8e47b7d601d38276d709957"} Apr 22 15:08:40.811245 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.811221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-94cqk" event={"ID":"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177","Type":"ContainerStarted","Data":"2eb7f13b08124d3abfcdb6f1daabae6e69305a57445f095454f50c6fccd5bbdd"} Apr 22 15:08:40.812676 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.812656 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal" 
event={"ID":"4f00231807412dde1ca296d35e880b1b","Type":"ContainerStarted","Data":"b6fd8998f666a97b12ad5e8cf58dca11617c5ee8853545c9640cafd9d3613d83"} Apr 22 15:08:40.814228 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.814185 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p49tx" event={"ID":"5fcf9aef-8476-4cab-aa68-0f61db3e03f3","Type":"ContainerStarted","Data":"801893ceb617db864f0bb64a7f193c27fba9cc98776134a7cfef2a8dc1732d04"} Apr 22 15:08:40.815586 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.815565 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4sxvb" event={"ID":"031a7138-6b28-4cf1-9f28-ca9c3f9e3225","Type":"ContainerStarted","Data":"e4172698541827bf504f51d739536dc49b48a2ff7479878c9c3f6f0338a84c4d"} Apr 22 15:08:40.816774 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.816752 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p49zs" event={"ID":"7c6e1747-9197-468a-b61e-0e687eab6eaa","Type":"ContainerStarted","Data":"1e150ece773c230ecd1a8a8a12f8fbe1ad3d75e3d01aacce231364c8384f10fa"} Apr 22 15:08:40.817522 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.817506 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" event={"ID":"83a4c543-c34e-4cca-b476-71845b4617e3","Type":"ContainerStarted","Data":"7371fceb2d64dafa45303d2799bcf22c51ef142772058f192f139e29dc9f0b7a"} Apr 22 15:08:40.827059 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:40.827023 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-228.ec2.internal" podStartSLOduration=2.827008684 podStartE2EDuration="2.827008684s" podCreationTimestamp="2026-04-22 15:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 
15:08:40.825954884 +0000 UTC m=+3.612591944" watchObservedRunningTime="2026-04-22 15:08:40.827008684 +0000 UTC m=+3.613645745" Apr 22 15:08:41.305770 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:41.305733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:41.305958 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:41.305898 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:41.306028 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:41.305958 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. No retries permitted until 2026-04-22 15:08:43.305940542 +0000 UTC m=+6.092577583 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:41.507230 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:41.507089 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:41.507386 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:41.507288 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:41.507386 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:41.507305 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:41.507386 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:41.507317 2577 projected.go:194] Error preparing data for projected volume kube-api-access-fh8bz for pod openshift-network-diagnostics/network-check-target-bvrrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:41.507386 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:41.507370 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz podName:c7876708-f581-4c0c-becb-c7c90e442cda nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:43.507353487 +0000 UTC m=+6.293990527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fh8bz" (UniqueName: "kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz") pod "network-check-target-bvrrk" (UID: "c7876708-f581-4c0c-becb-c7c90e442cda") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:41.800444 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:41.800361 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:41.800856 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:41.800490 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:41.800856 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:41.800828 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:41.800971 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:41.800936 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:41.827000 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:41.826966 2577 generic.go:358] "Generic (PLEG): container finished" podID="514e0b046b2478b20bc5c0831a3ea228" containerID="52f38bf1945e3609a404bd50f28e1af2e94c49e4fc136fef3a81a28944128f74" exitCode=0 Apr 22 15:08:41.827255 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:41.827229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" event={"ID":"514e0b046b2478b20bc5c0831a3ea228","Type":"ContainerDied","Data":"52f38bf1945e3609a404bd50f28e1af2e94c49e4fc136fef3a81a28944128f74"} Apr 22 15:08:42.839295 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:42.839257 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" event={"ID":"514e0b046b2478b20bc5c0831a3ea228","Type":"ContainerStarted","Data":"056712ad54f9ef2df4fbceba6d34a476fa5912f74fa100a253150265888607f9"} Apr 22 15:08:42.884008 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:42.883404 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-228.ec2.internal" podStartSLOduration=4.88338383 podStartE2EDuration="4.88338383s" podCreationTimestamp="2026-04-22 15:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:08:42.88326107 +0000 UTC m=+5.669898129" watchObservedRunningTime="2026-04-22 15:08:42.88338383 +0000 UTC m=+5.670020891" Apr 22 15:08:43.327725 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:43.327620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:43.327881 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:43.327777 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:43.327881 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:43.327852 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. No retries permitted until 2026-04-22 15:08:47.327831109 +0000 UTC m=+10.114468153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:43.529485 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:43.529450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:43.529633 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:43.529595 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:43.529633 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:43.529613 2577 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:43.529633 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:43.529624 2577 projected.go:194] Error preparing data for projected volume kube-api-access-fh8bz for pod openshift-network-diagnostics/network-check-target-bvrrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:43.529758 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:43.529679 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz podName:c7876708-f581-4c0c-becb-c7c90e442cda nodeName:}" failed. No retries permitted until 2026-04-22 15:08:47.529661243 +0000 UTC m=+10.316298296 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-fh8bz" (UniqueName: "kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz") pod "network-check-target-bvrrk" (UID: "c7876708-f581-4c0c-becb-c7c90e442cda") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:43.800422 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:43.800319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:43.800584 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:43.800463 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:43.800862 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:43.800837 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:43.800957 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:43.800933 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:45.800014 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:45.799820 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:45.800014 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:45.799958 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:45.800541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:45.800515 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:45.800648 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:45.800615 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:47.127349 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.127313 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-t86n6"] Apr 22 15:08:47.129853 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.129411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.129853 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.129494 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:08:47.184366 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.184321 2577 status_manager.go:895] "Failed to get status for pod" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" pod="kube-system/global-pull-secret-syncer-t86n6" err="pods \"global-pull-secret-syncer-t86n6\" is forbidden: User \"system:node:ip-10-0-137-228.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ip-10-0-137-228.ec2.internal' and this object" Apr 22 15:08:47.259170 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.259137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-kubelet-config\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.259170 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.259184 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.259436 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.259232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-dbus\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.359722 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-dbus\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.359796 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.359836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-kubelet-config\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.359865 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.359982 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.360043 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret 
podName:8348847f-4e3a-43f6-bbab-5b6d67eff9fd nodeName:}" failed. No retries permitted until 2026-04-22 15:08:47.860024808 +0000 UTC m=+10.646661863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret") pod "global-pull-secret-syncer-t86n6" (UID: "8348847f-4e3a-43f6-bbab-5b6d67eff9fd") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.360395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-dbus\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.360490 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.360533 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. No retries permitted until 2026-04-22 15:08:55.360518031 +0000 UTC m=+18.147155087 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:47.360569 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.360531 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-kubelet-config\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.561009 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.560879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:47.561182 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.561085 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:47.561182 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.561114 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:47.561182 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.561130 2577 projected.go:194] Error preparing data for projected volume kube-api-access-fh8bz for pod openshift-network-diagnostics/network-check-target-bvrrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:47.561803 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.561207 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz podName:c7876708-f581-4c0c-becb-c7c90e442cda nodeName:}" failed. No retries permitted until 2026-04-22 15:08:55.561172985 +0000 UTC m=+18.347810046 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-fh8bz" (UniqueName: "kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz") pod "network-check-target-bvrrk" (UID: "c7876708-f581-4c0c-becb-c7c90e442cda") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:47.800998 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.800769 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:47.800998 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.800903 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:47.801257 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.801225 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:47.801350 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.801314 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:47.864085 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:47.864007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:47.864262 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.864186 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:47.864339 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:47.864264 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret podName:8348847f-4e3a-43f6-bbab-5b6d67eff9fd nodeName:}" failed. No retries permitted until 2026-04-22 15:08:48.864245604 +0000 UTC m=+11.650882655 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret") pod "global-pull-secret-syncer-t86n6" (UID: "8348847f-4e3a-43f6-bbab-5b6d67eff9fd") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:48.799573 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:48.799537 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:48.799952 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:48.799675 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:08:48.872383 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:48.872297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:48.872547 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:48.872438 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:48.872547 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:48.872521 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret podName:8348847f-4e3a-43f6-bbab-5b6d67eff9fd nodeName:}" failed. 
No retries permitted until 2026-04-22 15:08:50.87249928 +0000 UTC m=+13.659136330 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret") pod "global-pull-secret-syncer-t86n6" (UID: "8348847f-4e3a-43f6-bbab-5b6d67eff9fd") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:49.799156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:49.799076 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:49.799326 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:49.799076 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:49.799326 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:49.799226 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:49.799326 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:49.799302 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:50.800009 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:50.799974 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:50.800480 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:50.800098 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:08:50.887522 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:50.887489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:50.887679 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:50.887646 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:50.887736 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:50.887717 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret podName:8348847f-4e3a-43f6-bbab-5b6d67eff9fd nodeName:}" failed. No retries permitted until 2026-04-22 15:08:54.887697893 +0000 UTC m=+17.674334943 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret") pod "global-pull-secret-syncer-t86n6" (UID: "8348847f-4e3a-43f6-bbab-5b6d67eff9fd") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:51.799312 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:51.799228 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:51.799497 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:51.799370 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:51.799497 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:51.799437 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:51.799607 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:51.799546 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:52.799359 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:52.799330 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:52.799696 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:52.799426 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:08:53.799561 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:53.799530 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:53.800048 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:53.799540 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:53.800048 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:53.799641 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:53.800048 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:53.799744 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:54.799635 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:54.799599 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:54.800038 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:54.799734 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:08:54.919713 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:54.919681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:54.919894 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:54.919823 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:54.919894 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:54.919887 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret podName:8348847f-4e3a-43f6-bbab-5b6d67eff9fd nodeName:}" failed. No retries permitted until 2026-04-22 15:09:02.919873352 +0000 UTC m=+25.706510405 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret") pod "global-pull-secret-syncer-t86n6" (UID: "8348847f-4e3a-43f6-bbab-5b6d67eff9fd") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:08:55.423541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:55.423504 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:55.423720 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:55.423651 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:55.423781 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:55.423722 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. No retries permitted until 2026-04-22 15:09:11.42370261 +0000 UTC m=+34.210339666 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:08:55.624487 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:55.624454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:55.624624 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:55.624576 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:08:55.624624 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:55.624588 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:08:55.624624 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:55.624597 2577 projected.go:194] Error preparing data for projected volume kube-api-access-fh8bz for pod openshift-network-diagnostics/network-check-target-bvrrk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:55.624735 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:55.624639 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz podName:c7876708-f581-4c0c-becb-c7c90e442cda nodeName:}" failed. 
No retries permitted until 2026-04-22 15:09:11.62462678 +0000 UTC m=+34.411263818 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fh8bz" (UniqueName: "kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz") pod "network-check-target-bvrrk" (UID: "c7876708-f581-4c0c-becb-c7c90e442cda") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:08:55.799784 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:55.799748 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:55.800263 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:55.799786 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:55.800263 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:55.799882 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:55.800263 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:55.800023 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:56.799509 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:56.799470 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:56.799687 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:56.799609 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:08:57.799745 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.799560 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:57.800437 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:57.799818 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:57.800437 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.799610 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:57.800437 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:57.799896 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:57.864113 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.864084 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerStarted","Data":"cfa33390d9e85aabcf2b83463ee29290c40c0a45c323574db466cd08b991f5aa"} Apr 22 15:08:57.865524 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.865494 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-94cqk" event={"ID":"4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177","Type":"ContainerStarted","Data":"ee38352c63bb0ce6f964cd0c7b35cfb1c20c1a3f8908c96ffa73a19f681a5cc5"} Apr 22 15:08:57.866740 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.866713 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4sxvb" event={"ID":"031a7138-6b28-4cf1-9f28-ca9c3f9e3225","Type":"ContainerStarted","Data":"aedfe6729bbc1fcd5555d5a93acc6bb887570250987da8a440b390e045e779e3"} Apr 22 15:08:57.867826 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.867805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p49zs" event={"ID":"7c6e1747-9197-468a-b61e-0e687eab6eaa","Type":"ContainerStarted","Data":"3ed8c242d18f5fe9899709f0fad48ad18c4d4de8293973dc5ec17cbce8a70f2b"} Apr 22 15:08:57.868916 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.868899 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" event={"ID":"83a4c543-c34e-4cca-b476-71845b4617e3","Type":"ContainerStarted","Data":"b92c44fff685f4b47d185be2db1796ce0a26e42219e26aa5b75204585a6382e3"} Apr 22 15:08:57.869998 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.869980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" event={"ID":"ab887213-98f0-4051-a99d-d23453b1ec24","Type":"ContainerStarted","Data":"0d612600ad7556e17e8b7313611d530895edb27b11e927a0d92e9d52c67858a7"} Apr 22 15:08:57.872969 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.872938 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vrpzt" event={"ID":"d075efdc-d5f5-490a-a543-09e52a1f9e38","Type":"ContainerStarted","Data":"4ba37566a4f4d13666b7d095c01cfe494058785323e09ea68b53fedb428581e1"} Apr 22 15:08:57.874302 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.874283 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"7949110f76e2cee6fd0e05cfc77eb5af7af24389fd71f8d51b2b3a01123f029f"} Apr 22 15:08:57.915609 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.915566 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-94cqk" podStartSLOduration=4.03859613 podStartE2EDuration="20.915553752s" podCreationTimestamp="2026-04-22 15:08:37 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.610016018 +0000 UTC m=+3.396653062" lastFinishedPulling="2026-04-22 15:08:57.486973632 +0000 UTC m=+20.273610684" observedRunningTime="2026-04-22 15:08:57.915347934 +0000 UTC m=+20.701984993" watchObservedRunningTime="2026-04-22 15:08:57.915553752 +0000 UTC m=+20.702190812" Apr 22 15:08:57.932333 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.932299 
2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vrpzt" podStartSLOduration=8.723314158 podStartE2EDuration="20.932285308s" podCreationTimestamp="2026-04-22 15:08:37 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.601267934 +0000 UTC m=+3.387904972" lastFinishedPulling="2026-04-22 15:08:52.81023907 +0000 UTC m=+15.596876122" observedRunningTime="2026-04-22 15:08:57.932205334 +0000 UTC m=+20.718842390" watchObservedRunningTime="2026-04-22 15:08:57.932285308 +0000 UTC m=+20.718922369" Apr 22 15:08:57.950793 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.950755 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4sxvb" podStartSLOduration=3.114849149 podStartE2EDuration="19.950743635s" podCreationTimestamp="2026-04-22 15:08:38 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.613411971 +0000 UTC m=+3.400049013" lastFinishedPulling="2026-04-22 15:08:57.449306457 +0000 UTC m=+20.235943499" observedRunningTime="2026-04-22 15:08:57.950686878 +0000 UTC m=+20.737323938" watchObservedRunningTime="2026-04-22 15:08:57.950743635 +0000 UTC m=+20.737380694" Apr 22 15:08:57.967850 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.967796 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s5jjg" podStartSLOduration=4.123485567 podStartE2EDuration="20.967777762s" podCreationTimestamp="2026-04-22 15:08:37 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.605012891 +0000 UTC m=+3.391649934" lastFinishedPulling="2026-04-22 15:08:57.449305079 +0000 UTC m=+20.235942129" observedRunningTime="2026-04-22 15:08:57.967640474 +0000 UTC m=+20.754277533" watchObservedRunningTime="2026-04-22 15:08:57.967777762 +0000 UTC m=+20.754414823" Apr 22 15:08:57.983448 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:57.983407 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-p49zs" podStartSLOduration=3.467529798 podStartE2EDuration="19.983391671s" podCreationTimestamp="2026-04-22 15:08:38 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.610983319 +0000 UTC m=+3.397620360" lastFinishedPulling="2026-04-22 15:08:57.126845192 +0000 UTC m=+19.913482233" observedRunningTime="2026-04-22 15:08:57.982925907 +0000 UTC m=+20.769562968" watchObservedRunningTime="2026-04-22 15:08:57.983391671 +0000 UTC m=+20.770028731" Apr 22 15:08:58.799489 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.799464 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:08:58.799594 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:58.799561 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:08:58.876556 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.876524 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p49tx" event={"ID":"5fcf9aef-8476-4cab-aa68-0f61db3e03f3","Type":"ContainerStarted","Data":"d683ab2d8b4c5f5680bc1f6ec061d670ba70491559284705ad73da0a0a63699a"} Apr 22 15:08:58.878803 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.878782 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log" Apr 22 15:08:58.879068 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.879048 2577 generic.go:358] "Generic (PLEG): container finished" podID="82f59d25-cb9b-4bfc-a131-c631b53ef9c3" containerID="fbb3a9229e2c8e9d754197321b14c85e800d9088136d76e82938e90023a926b2" exitCode=1 Apr 22 15:08:58.879134 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.879115 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerDied","Data":"fbb3a9229e2c8e9d754197321b14c85e800d9088136d76e82938e90023a926b2"} Apr 22 15:08:58.879180 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.879142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"0b438731f9ddba1011ca79fa2e953b96677d79f6d2cce17e2fb0c2ceb7b34974"} Apr 22 15:08:58.879180 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.879155 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"2483b5b226e81fc2163d64277284609f71aa3d9cbce972664ca6cc47597fad40"} Apr 22 
15:08:58.879180 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.879168 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"a40c752c75d8a7eb4f4d48238481b85fa0479504a144d435f1ff4fe81c107e4f"} Apr 22 15:08:58.879306 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.879181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"dcef1c0b0385626b02db01282ac6c978a86d40b09b1081235e33a342ba728500"} Apr 22 15:08:58.880289 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.880265 2577 generic.go:358] "Generic (PLEG): container finished" podID="daf221a4-075f-4ecb-83fb-afb1b4d25997" containerID="cfa33390d9e85aabcf2b83463ee29290c40c0a45c323574db466cd08b991f5aa" exitCode=0 Apr 22 15:08:58.880388 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.880307 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerDied","Data":"cfa33390d9e85aabcf2b83463ee29290c40c0a45c323574db466cd08b991f5aa"} Apr 22 15:08:58.942754 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:58.942664 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-p49tx" podStartSLOduration=5.107934585 podStartE2EDuration="21.942649842s" podCreationTimestamp="2026-04-22 15:08:37 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.614878132 +0000 UTC m=+3.401515173" lastFinishedPulling="2026-04-22 15:08:57.449593386 +0000 UTC m=+20.236230430" observedRunningTime="2026-04-22 15:08:58.904345368 +0000 UTC m=+21.690982428" watchObservedRunningTime="2026-04-22 15:08:58.942649842 +0000 UTC m=+21.729286909" Apr 22 15:08:59.136258 ip-10-0-137-228 kubenswrapper[2577]: 
I0422 15:08:59.136105 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 15:08:59.737690 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:59.737579 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:08:59.136253204Z","UUID":"2dfdc406-63b1-4a2f-9282-f6e560db3d85","Handler":null,"Name":"","Endpoint":""} Apr 22 15:08:59.740866 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:59.740836 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 15:08:59.740866 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:59.740871 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 15:08:59.799644 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:59.799615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:08:59.799767 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:59.799680 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:08:59.799826 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:59.799804 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:08:59.799980 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:08:59.799946 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:08:59.884258 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:08:59.884220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" event={"ID":"83a4c543-c34e-4cca-b476-71845b4617e3","Type":"ContainerStarted","Data":"1bfc0778857b387618f9771b6abdf38b0460242b38193f3701d5c638c8b01747"} Apr 22 15:09:00.188033 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.188002 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:09:00.188678 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.188663 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:09:00.802083 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.800004 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:09:00.802083 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:00.800172 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:09:00.888315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.888278 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" event={"ID":"83a4c543-c34e-4cca-b476-71845b4617e3","Type":"ContainerStarted","Data":"749a7934461d35bd6564f3dd7cb72971e18d06e48444f7c4bd18ef7d4edd013c"} Apr 22 15:09:00.891049 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.891024 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log" Apr 22 15:09:00.891383 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.891359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"25a2bc8a86a33b769f1d6cfda95c13ad14ab171453d752da84a72e584cb02df4"} Apr 22 15:09:00.891590 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.891567 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:09:00.892100 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.892071 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-p49zs" Apr 22 15:09:00.909427 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:00.909380 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zhm4z" podStartSLOduration=4.074491026 podStartE2EDuration="23.909368562s" podCreationTimestamp="2026-04-22 15:08:37 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.605739062 +0000 UTC m=+3.392376106" lastFinishedPulling="2026-04-22 15:09:00.440616589 +0000 UTC m=+23.227253642" observedRunningTime="2026-04-22 15:09:00.908954107 +0000 UTC 
m=+23.695591168" watchObservedRunningTime="2026-04-22 15:09:00.909368562 +0000 UTC m=+23.696005622" Apr 22 15:09:01.799269 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:01.799227 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:09:01.799441 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:01.799350 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:09:01.799441 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:01.799388 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:09:01.799528 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:01.799460 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:09:02.799580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:02.799547 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:09:02.800021 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:02.799677 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd" Apr 22 15:09:02.985023 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:02.984953 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:09:02.985174 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:02.985138 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 15:09:02.985236 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:02.985228 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret podName:8348847f-4e3a-43f6-bbab-5b6d67eff9fd nodeName:}" failed. No retries permitted until 2026-04-22 15:09:18.985207066 +0000 UTC m=+41.771844118 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret") pod "global-pull-secret-syncer-t86n6" (UID: "8348847f-4e3a-43f6-bbab-5b6d67eff9fd") : object "kube-system"/"original-pull-secret" not registered Apr 22 15:09:03.799345 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.799317 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:09:03.799435 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.799346 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:09:03.799478 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:03.799458 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:09:03.799575 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:03.799557 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda" Apr 22 15:09:03.900084 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.900059 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log" Apr 22 15:09:03.900570 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.900395 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"4c1130103c25e51b9351184c4f2a5c3c967242c0659e1a65477309f1f6330cad"} Apr 22 15:09:03.900813 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.900795 2577 scope.go:117] "RemoveContainer" containerID="fbb3a9229e2c8e9d754197321b14c85e800d9088136d76e82938e90023a926b2" Apr 22 15:09:03.900899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.900797 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:09:03.900943 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.900917 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:09:03.902168 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.902136 2577 generic.go:358] "Generic (PLEG): container finished" podID="daf221a4-075f-4ecb-83fb-afb1b4d25997" containerID="bde371bb04ff1ce7d60161da015afc1661d1f578dfd1d5b7d202297f3d1160e6" exitCode=0 Apr 22 15:09:03.902273 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:03.902170 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerDied","Data":"bde371bb04ff1ce7d60161da015afc1661d1f578dfd1d5b7d202297f3d1160e6"} Apr 22 15:09:03.916219 ip-10-0-137-228 kubenswrapper[2577]: 
I0422 15:09:03.916186 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:09:04.799976 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:04.799944 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6"
Apr 22 15:09:04.800120 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:04.800057 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd"
Apr 22 15:09:04.907233 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:04.907209 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log"
Apr 22 15:09:04.907616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:04.907522 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" event={"ID":"82f59d25-cb9b-4bfc-a131-c631b53ef9c3","Type":"ContainerStarted","Data":"3f3b7f0cdd8f821a22c58f66816985d76d904684c494256453ab660d01ba1bb9"}
Apr 22 15:09:04.907820 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:04.907793 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:09:04.921750 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:04.921728 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8"
Apr 22 15:09:04.941812 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:04.941768 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" podStartSLOduration=11.014108697 podStartE2EDuration="27.941752611s" podCreationTimestamp="2026-04-22 15:08:37 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.61245168 +0000 UTC m=+3.399088732" lastFinishedPulling="2026-04-22 15:08:57.540095604 +0000 UTC m=+20.326732646" observedRunningTime="2026-04-22 15:09:04.941272677 +0000 UTC m=+27.727909737" watchObservedRunningTime="2026-04-22 15:09:04.941752611 +0000 UTC m=+27.728389704"
Apr 22 15:09:05.300464 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:05.300277 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9nk69"]
Apr 22 15:09:05.300628 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:05.300559 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:09:05.300708 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:05.300685 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e"
Apr 22 15:09:05.306775 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:05.304611 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bvrrk"]
Apr 22 15:09:05.306775 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:05.304736 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk"
Apr 22 15:09:05.306775 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:05.304845 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda"
Apr 22 15:09:05.318899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:05.318871 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t86n6"]
Apr 22 15:09:05.319013 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:05.318962 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6"
Apr 22 15:09:05.319058 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:05.319035 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd"
Apr 22 15:09:05.910941 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:05.910908 2577 generic.go:358] "Generic (PLEG): container finished" podID="daf221a4-075f-4ecb-83fb-afb1b4d25997" containerID="8994a76d788875f75b9f7ecd5557a4e1a26598c9b11ec68c5f4be281a3a9df8b" exitCode=0
Apr 22 15:09:05.911377 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:05.910987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerDied","Data":"8994a76d788875f75b9f7ecd5557a4e1a26598c9b11ec68c5f4be281a3a9df8b"}
Apr 22 15:09:06.799139 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:06.799110 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk"
Apr 22 15:09:06.799285 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:06.799118 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:09:06.799285 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:06.799216 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda"
Apr 22 15:09:06.799353 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:06.799283 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e"
Apr 22 15:09:07.800208 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:07.800165 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6"
Apr 22 15:09:07.800755 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:07.800257 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd"
Apr 22 15:09:07.916605 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:07.916554 2577 generic.go:358] "Generic (PLEG): container finished" podID="daf221a4-075f-4ecb-83fb-afb1b4d25997" containerID="39c76b449ca3b191baf6cee1d084ea902c026c83aa87188b614538267255e538" exitCode=0
Apr 22 15:09:07.916766 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:07.916645 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerDied","Data":"39c76b449ca3b191baf6cee1d084ea902c026c83aa87188b614538267255e538"}
Apr 22 15:09:08.799154 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:08.799125 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:09:08.799353 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:08.799127 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk"
Apr 22 15:09:08.799353 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:08.799271 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e"
Apr 22 15:09:08.799470 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:08.799342 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bvrrk" podUID="c7876708-f581-4c0c-becb-c7c90e442cda"
Apr 22 15:09:09.799356 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:09.799319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6"
Apr 22 15:09:09.799757 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:09.799452 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-t86n6" podUID="8348847f-4e3a-43f6-bbab-5b6d67eff9fd"
Apr 22 15:09:10.525457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.525431 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-228.ec2.internal" event="NodeReady"
Apr 22 15:09:10.525623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.525557 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 15:09:10.572766 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.572734 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9thxk"]
Apr 22 15:09:10.589780 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.589735 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nlf5r"]
Apr 22 15:09:10.589938 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.589869 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.592301 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.592159 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkblf\""
Apr 22 15:09:10.592688 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.592530 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 15:09:10.592688 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.592570 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 15:09:10.605609 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.605587 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9thxk"]
Apr 22 15:09:10.605735 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.605617 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nlf5r"]
Apr 22 15:09:10.605735 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.605732 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:10.608623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.608466 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 15:09:10.608623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.608563 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cbqpc\""
Apr 22 15:09:10.608785 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.608727 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 15:09:10.609026 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.609010 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 15:09:10.747172 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.747138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99b60141-c5a1-4685-b0c9-f59380bb89b8-config-volume\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.747381 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.747181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.747381 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.747229 2577 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dlck\" (UniqueName: \"kubernetes.io/projected/99b60141-c5a1-4685-b0c9-f59380bb89b8-kube-api-access-4dlck\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.747381 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.747270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:10.747381 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.747352 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnjwb\" (UniqueName: \"kubernetes.io/projected/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-kube-api-access-gnjwb\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:10.747381 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.747374 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99b60141-c5a1-4685-b0c9-f59380bb89b8-tmp-dir\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.799378 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.799308 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk"
Apr 22 15:09:10.799876 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.799322 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:09:10.802043 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.802020 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 15:09:10.802173 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.802098 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-62z2n\""
Apr 22 15:09:10.802661 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.802639 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 15:09:10.802768 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.802738 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4nbn2\""
Apr 22 15:09:10.802832 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.802782 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 15:09:10.848611 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.848584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99b60141-c5a1-4685-b0c9-f59380bb89b8-config-volume\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.848737 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.848620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.848737 ip-10-0-137-228
kubenswrapper[2577]: I0422 15:09:10.848646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dlck\" (UniqueName: \"kubernetes.io/projected/99b60141-c5a1-4685-b0c9-f59380bb89b8-kube-api-access-4dlck\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.848737 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.848671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:10.848737 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.848709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnjwb\" (UniqueName: \"kubernetes.io/projected/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-kube-api-access-gnjwb\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:10.848924 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.848745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99b60141-c5a1-4685-b0c9-f59380bb89b8-tmp-dir\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.848924 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:10.848784 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:10.848924 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:10.848858 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert
podName:c6c9ff67-fc53-4fad-bac9-aa152e2c0640 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:11.348839638 +0000 UTC m=+34.135476678 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert") pod "ingress-canary-nlf5r" (UID: "c6c9ff67-fc53-4fad-bac9-aa152e2c0640") : secret "canary-serving-cert" not found
Apr 22 15:09:10.848924 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:10.848873 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:10.849119 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:10.848934 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls podName:99b60141-c5a1-4685-b0c9-f59380bb89b8 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:11.348922435 +0000 UTC m=+34.135559480 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls") pod "dns-default-9thxk" (UID: "99b60141-c5a1-4685-b0c9-f59380bb89b8") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:10.849280 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.849263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99b60141-c5a1-4685-b0c9-f59380bb89b8-tmp-dir\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.849468 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.849451 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99b60141-c5a1-4685-b0c9-f59380bb89b8-config-volume\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.862660 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.862639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dlck\" (UniqueName: \"kubernetes.io/projected/99b60141-c5a1-4685-b0c9-f59380bb89b8-kube-api-access-4dlck\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:10.862802 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:10.862784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnjwb\" (UniqueName: \"kubernetes.io/projected/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-kube-api-access-gnjwb\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:11.353682 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.353647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\"
(UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:11.353682 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.353689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:11.353908 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:11.353793 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:11.353908 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:11.353853 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls podName:99b60141-c5a1-4685-b0c9-f59380bb89b8 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:12.35383895 +0000 UTC m=+35.140475989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls") pod "dns-default-9thxk" (UID: "99b60141-c5a1-4685-b0c9-f59380bb89b8") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:11.353908 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:11.353793 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:11.354018 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:11.353971 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert podName:c6c9ff67-fc53-4fad-bac9-aa152e2c0640 nodeName:}" failed.
No retries permitted until 2026-04-22 15:09:12.353950166 +0000 UTC m=+35.140587220 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert") pod "ingress-canary-nlf5r" (UID: "c6c9ff67-fc53-4fad-bac9-aa152e2c0640") : secret "canary-serving-cert" not found
Apr 22 15:09:11.454213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.454163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:09:11.454397 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:11.454326 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 15:09:11.454451 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:11.454406 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. No retries permitted until 2026-04-22 15:09:43.454386155 +0000 UTC m=+66.241023205 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : secret "metrics-daemon-secret" not found
Apr 22 15:09:11.656587 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.656512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk"
Apr 22 15:09:11.658919 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.658892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh8bz\" (UniqueName: \"kubernetes.io/projected/c7876708-f581-4c0c-becb-c7c90e442cda-kube-api-access-fh8bz\") pod \"network-check-target-bvrrk\" (UID: \"c7876708-f581-4c0c-becb-c7c90e442cda\") " pod="openshift-network-diagnostics/network-check-target-bvrrk"
Apr 22 15:09:11.710875 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.710843 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bvrrk"
Apr 22 15:09:11.799931 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.799763 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6"
Apr 22 15:09:11.805795 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.805773 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 15:09:11.875804 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.875762 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bvrrk"]
Apr 22 15:09:11.880214 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:09:11.880169 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7876708_f581_4c0c_becb_c7c90e442cda.slice/crio-1cda31a9b63070d3b9b53508720f1e52e73b6ab89524545edffbc8a1baecb631 WatchSource:0}: Error finding container 1cda31a9b63070d3b9b53508720f1e52e73b6ab89524545edffbc8a1baecb631: Status 404 returned error can't find the container with id 1cda31a9b63070d3b9b53508720f1e52e73b6ab89524545edffbc8a1baecb631
Apr 22 15:09:11.924875 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:11.924779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bvrrk" event={"ID":"c7876708-f581-4c0c-becb-c7c90e442cda","Type":"ContainerStarted","Data":"1cda31a9b63070d3b9b53508720f1e52e73b6ab89524545edffbc8a1baecb631"}
Apr 22 15:09:12.360784 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:12.360750 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:12.361096 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:12.360793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName:
\"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:12.361096 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:12.360915 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:12.361096 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:12.360918 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:09:12.361096 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:12.360972 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert podName:c6c9ff67-fc53-4fad-bac9-aa152e2c0640 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:14.360957653 +0000 UTC m=+37.147594696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert") pod "ingress-canary-nlf5r" (UID: "c6c9ff67-fc53-4fad-bac9-aa152e2c0640") : secret "canary-serving-cert" not found
Apr 22 15:09:12.361096 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:12.361059 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls podName:99b60141-c5a1-4685-b0c9-f59380bb89b8 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:14.361043625 +0000 UTC m=+37.147680670 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls") pod "dns-default-9thxk" (UID: "99b60141-c5a1-4685-b0c9-f59380bb89b8") : secret "dns-default-metrics-tls" not found
Apr 22 15:09:14.377475 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:14.377441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk"
Apr 22 15:09:14.377833 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:14.377481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r"
Apr 22 15:09:14.377833 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:14.377595 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:09:14.377833 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:14.377650 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert podName:c6c9ff67-fc53-4fad-bac9-aa152e2c0640 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:18.377638214 +0000 UTC m=+41.164275252 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert") pod "ingress-canary-nlf5r" (UID: "c6c9ff67-fc53-4fad-bac9-aa152e2c0640") : secret "canary-serving-cert" not found Apr 22 15:09:14.377833 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:14.377595 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:09:14.377833 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:14.377752 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls podName:99b60141-c5a1-4685-b0c9-f59380bb89b8 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:18.37773299 +0000 UTC m=+41.164370048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls") pod "dns-default-9thxk" (UID: "99b60141-c5a1-4685-b0c9-f59380bb89b8") : secret "dns-default-metrics-tls" not found Apr 22 15:09:14.933537 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:14.933457 2577 generic.go:358] "Generic (PLEG): container finished" podID="daf221a4-075f-4ecb-83fb-afb1b4d25997" containerID="b38acecc9b36e84e6c4ad6b2ff5533f53493e202bfb0250a708975dd62b452cb" exitCode=0 Apr 22 15:09:14.933537 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:14.933504 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerDied","Data":"b38acecc9b36e84e6c4ad6b2ff5533f53493e202bfb0250a708975dd62b452cb"} Apr 22 15:09:15.938753 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:15.938717 2577 generic.go:358] "Generic (PLEG): container finished" podID="daf221a4-075f-4ecb-83fb-afb1b4d25997" containerID="84d575bac5df32801d86f92bb921b35bd279e5b951508c64d10c95b2d8929154" exitCode=0 Apr 22 
15:09:15.939185 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:15.938779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerDied","Data":"84d575bac5df32801d86f92bb921b35bd279e5b951508c64d10c95b2d8929154"} Apr 22 15:09:16.943627 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:16.943457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" event={"ID":"daf221a4-075f-4ecb-83fb-afb1b4d25997","Type":"ContainerStarted","Data":"902eab50d48a61aaba0df5181722fd51f89d8c832890bf800800f6e0a443be42"} Apr 22 15:09:16.971901 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:16.970443 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sg7kx" podStartSLOduration=5.017602319 podStartE2EDuration="38.970426097s" podCreationTimestamp="2026-04-22 15:08:38 +0000 UTC" firstStartedPulling="2026-04-22 15:08:40.612011255 +0000 UTC m=+3.398648302" lastFinishedPulling="2026-04-22 15:09:14.564835039 +0000 UTC m=+37.351472080" observedRunningTime="2026-04-22 15:09:16.969334599 +0000 UTC m=+39.755971661" watchObservedRunningTime="2026-04-22 15:09:16.970426097 +0000 UTC m=+39.757063159" Apr 22 15:09:18.404735 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:18.404647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk" Apr 22 15:09:18.404735 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:18.404689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" 
(UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r" Apr 22 15:09:18.405153 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:18.404788 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:09:18.405153 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:18.404852 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls podName:99b60141-c5a1-4685-b0c9-f59380bb89b8 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:26.404835194 +0000 UTC m=+49.191472237 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls") pod "dns-default-9thxk" (UID: "99b60141-c5a1-4685-b0c9-f59380bb89b8") : secret "dns-default-metrics-tls" not found Apr 22 15:09:18.405153 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:18.404857 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:09:18.405153 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:18.404911 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert podName:c6c9ff67-fc53-4fad-bac9-aa152e2c0640 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:26.404895196 +0000 UTC m=+49.191532237 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert") pod "ingress-canary-nlf5r" (UID: "c6c9ff67-fc53-4fad-bac9-aa152e2c0640") : secret "canary-serving-cert" not found Apr 22 15:09:18.948626 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:18.948592 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bvrrk" event={"ID":"c7876708-f581-4c0c-becb-c7c90e442cda","Type":"ContainerStarted","Data":"46c2231aaca94aad0854d766500896b73946f23a9b6d4340454d89fe97bfee92"} Apr 22 15:09:18.948792 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:18.948716 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:09:18.966728 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:18.966679 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bvrrk" podStartSLOduration=34.724551899 podStartE2EDuration="40.96666622s" podCreationTimestamp="2026-04-22 15:08:38 +0000 UTC" firstStartedPulling="2026-04-22 15:09:11.882563411 +0000 UTC m=+34.669200457" lastFinishedPulling="2026-04-22 15:09:18.124677737 +0000 UTC m=+40.911314778" observedRunningTime="2026-04-22 15:09:18.966446882 +0000 UTC m=+41.753083941" watchObservedRunningTime="2026-04-22 15:09:18.96666622 +0000 UTC m=+41.753303281" Apr 22 15:09:19.008686 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:19.008661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:09:19.012523 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:19.012501 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8348847f-4e3a-43f6-bbab-5b6d67eff9fd-original-pull-secret\") pod \"global-pull-secret-syncer-t86n6\" (UID: \"8348847f-4e3a-43f6-bbab-5b6d67eff9fd\") " pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:09:19.014403 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:19.014386 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-t86n6" Apr 22 15:09:19.127845 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:19.127808 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-t86n6"] Apr 22 15:09:19.131351 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:09:19.131321 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8348847f_4e3a_43f6_bbab_5b6d67eff9fd.slice/crio-f922337282f0385b515d06830f4b732b63cb76d4aa96d54055a91689a4df88bf WatchSource:0}: Error finding container f922337282f0385b515d06830f4b732b63cb76d4aa96d54055a91689a4df88bf: Status 404 returned error can't find the container with id f922337282f0385b515d06830f4b732b63cb76d4aa96d54055a91689a4df88bf Apr 22 15:09:19.951280 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:19.951234 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t86n6" event={"ID":"8348847f-4e3a-43f6-bbab-5b6d67eff9fd","Type":"ContainerStarted","Data":"f922337282f0385b515d06830f4b732b63cb76d4aa96d54055a91689a4df88bf"} Apr 22 15:09:23.958749 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:23.958661 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-t86n6" event={"ID":"8348847f-4e3a-43f6-bbab-5b6d67eff9fd","Type":"ContainerStarted","Data":"f5f02fac63204f65a999bbc2ddfd0ea087846d9393ce5f4972891df4cb11d83e"} Apr 22 15:09:23.973587 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:23.973542 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-t86n6" podStartSLOduration=32.561161756 podStartE2EDuration="36.973530359s" podCreationTimestamp="2026-04-22 15:08:47 +0000 UTC" firstStartedPulling="2026-04-22 15:09:19.133092673 +0000 UTC m=+41.919729710" lastFinishedPulling="2026-04-22 15:09:23.545461275 +0000 UTC m=+46.332098313" observedRunningTime="2026-04-22 15:09:23.973472057 +0000 UTC m=+46.760109117" watchObservedRunningTime="2026-04-22 15:09:23.973530359 +0000 UTC m=+46.760167419" Apr 22 15:09:26.459260 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:26.459221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk" Apr 22 15:09:26.459260 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:26.459263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r" Apr 22 15:09:26.459652 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:26.459340 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:09:26.459652 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:26.459400 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls podName:99b60141-c5a1-4685-b0c9-f59380bb89b8 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:42.459383239 +0000 UTC m=+65.246020277 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls") pod "dns-default-9thxk" (UID: "99b60141-c5a1-4685-b0c9-f59380bb89b8") : secret "dns-default-metrics-tls" not found Apr 22 15:09:26.459652 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:26.459344 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:09:26.459652 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:26.459466 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert podName:c6c9ff67-fc53-4fad-bac9-aa152e2c0640 nodeName:}" failed. No retries permitted until 2026-04-22 15:09:42.459454641 +0000 UTC m=+65.246091678 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert") pod "ingress-canary-nlf5r" (UID: "c6c9ff67-fc53-4fad-bac9-aa152e2c0640") : secret "canary-serving-cert" not found Apr 22 15:09:36.928643 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:36.928614 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-42xf8" Apr 22 15:09:42.460344 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:42.460294 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk" Apr 22 15:09:42.460344 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:42.460349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: 
\"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r" Apr 22 15:09:42.460770 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:42.460432 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:09:42.460770 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:42.460436 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:09:42.460770 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:42.460489 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert podName:c6c9ff67-fc53-4fad-bac9-aa152e2c0640 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:14.460473741 +0000 UTC m=+97.247110780 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert") pod "ingress-canary-nlf5r" (UID: "c6c9ff67-fc53-4fad-bac9-aa152e2c0640") : secret "canary-serving-cert" not found Apr 22 15:09:42.460770 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:42.460501 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls podName:99b60141-c5a1-4685-b0c9-f59380bb89b8 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:14.460495918 +0000 UTC m=+97.247132956 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls") pod "dns-default-9thxk" (UID: "99b60141-c5a1-4685-b0c9-f59380bb89b8") : secret "dns-default-metrics-tls" not found Apr 22 15:09:43.467170 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:43.467130 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:09:43.467555 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:43.467285 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 15:09:43.467555 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:09:43.467361 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. No retries permitted until 2026-04-22 15:10:47.467345416 +0000 UTC m=+130.253982454 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : secret "metrics-daemon-secret" not found Apr 22 15:09:49.953319 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:09:49.953215 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bvrrk" Apr 22 15:10:14.469089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:14.469034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk" Apr 22 15:10:14.469089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:14.469086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r" Apr 22 15:10:14.469630 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:14.469185 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:10:14.469630 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:14.469252 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert podName:c6c9ff67-fc53-4fad-bac9-aa152e2c0640 nodeName:}" failed. No retries permitted until 2026-04-22 15:11:18.469237329 +0000 UTC m=+161.255874367 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert") pod "ingress-canary-nlf5r" (UID: "c6c9ff67-fc53-4fad-bac9-aa152e2c0640") : secret "canary-serving-cert" not found Apr 22 15:10:14.469630 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:14.469185 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:10:14.469630 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:14.469337 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls podName:99b60141-c5a1-4685-b0c9-f59380bb89b8 nodeName:}" failed. No retries permitted until 2026-04-22 15:11:18.469323137 +0000 UTC m=+161.255960179 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls") pod "dns-default-9thxk" (UID: "99b60141-c5a1-4685-b0c9-f59380bb89b8") : secret "dns-default-metrics-tls" not found Apr 22 15:10:39.657400 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.657366 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5"] Apr 22 15:10:39.659166 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.659149 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5" Apr 22 15:10:39.660704 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.660678 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-jztj7"] Apr 22 15:10:39.662217 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.662190 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.662764 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.662745 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-fnhmb\"" Apr 22 15:10:39.662845 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.662767 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 15:10:39.662845 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.662809 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:10:39.664449 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.664426 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:10:39.664584 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.664433 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 15:10:39.664734 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.664718 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 15:10:39.664810 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.664746 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qpnml\"" Apr 22 15:10:39.664810 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.664769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 15:10:39.670926 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.670905 2577 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 15:10:39.679595 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.679575 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5"] Apr 22 15:10:39.680182 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.680164 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-jztj7"] Apr 22 15:10:39.733787 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.733763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fe0a454-c595-4c12-b2ff-afc448fddec1-trusted-ca\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.733911 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.733813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72vp4\" (UniqueName: \"kubernetes.io/projected/8fe0a454-c595-4c12-b2ff-afc448fddec1-kube-api-access-72vp4\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.733911 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.733831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe0a454-c595-4c12-b2ff-afc448fddec1-serving-cert\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.733986 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.733909 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-489qt\" (UniqueName: \"kubernetes.io/projected/af68fd91-9826-405c-b2a7-d2ea31c49737-kube-api-access-489qt\") pod \"volume-data-source-validator-7c6cbb6c87-m7bf5\" (UID: \"af68fd91-9826-405c-b2a7-d2ea31c49737\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5" Apr 22 15:10:39.733986 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.733940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe0a454-c595-4c12-b2ff-afc448fddec1-config\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.761129 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.761101 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c"] Apr 22 15:10:39.762868 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.762854 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c" Apr 22 15:10:39.765308 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.765273 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-lm6ks\"" Apr 22 15:10:39.766141 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.766112 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn"] Apr 22 15:10:39.767708 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.767692 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w75nz"] Apr 22 15:10:39.767836 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.767816 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" Apr 22 15:10:39.771302 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.771278 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 15:10:39.771536 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.771513 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 15:10:39.771718 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.771698 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 15:10:39.771863 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.771839 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-txdpp\"" Apr 22 15:10:39.772122 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.772104 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:10:39.773772 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.773752 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"] Apr 22 15:10:39.773942 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.773914 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.775484 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.775468 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6fd4d896fc-ltlnc"] Apr 22 15:10:39.775595 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.775580 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" Apr 22 15:10:39.777017 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.777001 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.788851 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.788833 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 15:10:39.789402 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.788836 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 15:10:39.789682 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.789653 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c"] Apr 22 15:10:39.789682 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.789674 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 15:10:39.789810 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.789707 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 15:10:39.789919 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.789871 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 15:10:39.789919 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.789902 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 15:10:39.790020 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.789928 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dblpk\"" Apr 22 15:10:39.791108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.791089 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 15:10:39.797658 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.797641 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 15:10:39.797744 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.797641 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-74xx2\"" Apr 22 15:10:39.798226 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.798187 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-47lns\"" Apr 22 15:10:39.801971 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.801953 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 15:10:39.802182 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.802167 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"] Apr 22 15:10:39.802243 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.802210 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn"] Apr 22 15:10:39.802243 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.802219 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 15:10:39.802312 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.802224 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w75nz"] Apr 22 15:10:39.803271 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.803257 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 15:10:39.803525 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.803509 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 15:10:39.822470 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.822451 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 15:10:39.827419 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.827376 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fd4d896fc-ltlnc"] Apr 22 15:10:39.834326 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834306 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9070e43c-98ec-4211-8d57-0154d4934914-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" Apr 22 15:10:39.834426 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-594tc\" (UniqueName: \"kubernetes.io/projected/9070e43c-98ec-4211-8d57-0154d4934914-kube-api-access-594tc\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" Apr 22 15:10:39.834426 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgff4\" (UniqueName: \"kubernetes.io/projected/51174689-eee9-47c4-95c1-890adba68f5a-kube-api-access-kgff4\") pod 
\"network-check-source-8894fc9bd-g7h5c\" (UID: \"51174689-eee9-47c4-95c1-890adba68f5a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c" Apr 22 15:10:39.834499 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09812955-b6a7-49e2-9f95-13ab00645d14-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" Apr 22 15:10:39.834499 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fe0a454-c595-4c12-b2ff-afc448fddec1-trusted-ca\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.834499 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3013d5-791d-4be7-9302-a69cf49ab049-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.834633 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-image-registry-private-configuration\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " 
pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.834633 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.834633 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-installation-pull-secrets\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.834752 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72vp4\" (UniqueName: \"kubernetes.io/projected/8fe0a454-c595-4c12-b2ff-afc448fddec1-kube-api-access-72vp4\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.834752 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe0a454-c595-4c12-b2ff-afc448fddec1-serving-cert\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.834752 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834741 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-trusted-ca\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.834895 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aefab03a-4e84-4f7e-9778-d0da04255bfd-ca-trust-extracted\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.834895 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09812955-b6a7-49e2-9f95-13ab00645d14-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" Apr 22 15:10:39.834895 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" Apr 22 15:10:39.834895 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834861 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-489qt\" (UniqueName: 
\"kubernetes.io/projected/af68fd91-9826-405c-b2a7-d2ea31c49737-kube-api-access-489qt\") pod \"volume-data-source-validator-7c6cbb6c87-m7bf5\" (UID: \"af68fd91-9826-405c-b2a7-d2ea31c49737\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5" Apr 22 15:10:39.834895 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9596l\" (UniqueName: \"kubernetes.io/projected/09812955-b6a7-49e2-9f95-13ab00645d14-kube-api-access-9596l\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" Apr 22 15:10:39.835130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe0a454-c595-4c12-b2ff-afc448fddec1-config\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.835130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834945 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-certificates\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.835130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.834989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ca3013d5-791d-4be7-9302-a69cf49ab049-snapshots\") pod 
\"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.835130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.835020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-bound-sa-token\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.835130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.835051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3013d5-791d-4be7-9302-a69cf49ab049-serving-cert\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.835130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.835079 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fh5r\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-kube-api-access-4fh5r\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.835130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.835106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfnv2\" (UniqueName: \"kubernetes.io/projected/ca3013d5-791d-4be7-9302-a69cf49ab049-kube-api-access-hfnv2\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.835433 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.835136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3013d5-791d-4be7-9302-a69cf49ab049-service-ca-bundle\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.835433 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.835165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca3013d5-791d-4be7-9302-a69cf49ab049-tmp\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.835433 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.835375 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8fe0a454-c595-4c12-b2ff-afc448fddec1-trusted-ca\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.835525 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.835463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fe0a454-c595-4c12-b2ff-afc448fddec1-config\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.836916 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.836898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8fe0a454-c595-4c12-b2ff-afc448fddec1-serving-cert\") pod \"console-operator-9d4b6777b-jztj7\" (UID: 
\"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.847434 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.847416 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72vp4\" (UniqueName: \"kubernetes.io/projected/8fe0a454-c595-4c12-b2ff-afc448fddec1-kube-api-access-72vp4\") pod \"console-operator-9d4b6777b-jztj7\" (UID: \"8fe0a454-c595-4c12-b2ff-afc448fddec1\") " pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:10:39.861046 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.861021 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-489qt\" (UniqueName: \"kubernetes.io/projected/af68fd91-9826-405c-b2a7-d2ea31c49737-kube-api-access-489qt\") pod \"volume-data-source-validator-7c6cbb6c87-m7bf5\" (UID: \"af68fd91-9826-405c-b2a7-d2ea31c49737\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5" Apr 22 15:10:39.936390 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3013d5-791d-4be7-9302-a69cf49ab049-service-ca-bundle\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.936390 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca3013d5-791d-4be7-9302-a69cf49ab049-tmp\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.936390 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936382 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9070e43c-98ec-4211-8d57-0154d4934914-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" Apr 22 15:10:39.936623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-594tc\" (UniqueName: \"kubernetes.io/projected/9070e43c-98ec-4211-8d57-0154d4934914-kube-api-access-594tc\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" Apr 22 15:10:39.936623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936431 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgff4\" (UniqueName: \"kubernetes.io/projected/51174689-eee9-47c4-95c1-890adba68f5a-kube-api-access-kgff4\") pod \"network-check-source-8894fc9bd-g7h5c\" (UID: \"51174689-eee9-47c4-95c1-890adba68f5a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c" Apr 22 15:10:39.936623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936453 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09812955-b6a7-49e2-9f95-13ab00645d14-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" Apr 22 15:10:39.936623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ca3013d5-791d-4be7-9302-a69cf49ab049-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.936623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936508 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-image-registry-private-configuration\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.936623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.936623 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936566 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-installation-pull-secrets\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-trusted-ca\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " 
pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aefab03a-4e84-4f7e-9778-d0da04255bfd-ca-trust-extracted\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09812955-b6a7-49e2-9f95-13ab00645d14-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:39.936726 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:39.936745 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd4d896fc-ltlnc: secret "image-registry-tls" not found Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: 
I0422 15:10:39.936752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9596l\" (UniqueName: \"kubernetes.io/projected/09812955-b6a7-49e2-9f95-13ab00645d14-kube-api-access-9596l\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-certificates\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:39.936824 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls podName:aefab03a-4e84-4f7e-9778-d0da04255bfd nodeName:}" failed. No retries permitted until 2026-04-22 15:10:40.43680492 +0000 UTC m=+123.223441975 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls") pod "image-registry-6fd4d896fc-ltlnc" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd") : secret "image-registry-tls" not found Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936866 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ca3013d5-791d-4be7-9302-a69cf49ab049-snapshots\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-bound-sa-token\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.936935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3013d5-791d-4be7-9302-a69cf49ab049-serving-cert\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.937529 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936943 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3013d5-791d-4be7-9302-a69cf49ab049-service-ca-bundle\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.937529 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fh5r\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-kube-api-access-4fh5r\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.937529 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.936991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfnv2\" (UniqueName: \"kubernetes.io/projected/ca3013d5-791d-4be7-9302-a69cf49ab049-kube-api-access-hfnv2\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.937529 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.937229 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aefab03a-4e84-4f7e-9778-d0da04255bfd-ca-trust-extracted\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.937529 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.937502 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-certificates\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.937807 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.937785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca3013d5-791d-4be7-9302-a69cf49ab049-tmp\") pod 
\"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.937890 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:39.937876 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:10:39.937936 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:39.937927 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls podName:9070e43c-98ec-4211-8d57-0154d4934914 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:40.437912054 +0000 UTC m=+123.224549104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6p57k" (UID: "9070e43c-98ec-4211-8d57-0154d4934914") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:10:39.938241 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.938215 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3013d5-791d-4be7-9302-a69cf49ab049-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz" Apr 22 15:10:39.938494 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.938472 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-trusted-ca\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:39.938582 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.938506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ca3013d5-791d-4be7-9302-a69cf49ab049-snapshots\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz"
Apr 22 15:10:39.938582 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.938539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09812955-b6a7-49e2-9f95-13ab00645d14-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn"
Apr 22 15:10:39.938723 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.938703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9070e43c-98ec-4211-8d57-0154d4934914-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:39.939251 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.939230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09812955-b6a7-49e2-9f95-13ab00645d14-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn"
Apr 22 15:10:39.939619 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.939597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-installation-pull-secrets\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:39.940047 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.940030 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3013d5-791d-4be7-9302-a69cf49ab049-serving-cert\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz"
Apr 22 15:10:39.940712 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.940696 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-image-registry-private-configuration\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:39.947796 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.947775 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-594tc\" (UniqueName: \"kubernetes.io/projected/9070e43c-98ec-4211-8d57-0154d4934914-kube-api-access-594tc\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:39.948284 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.948260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgff4\" (UniqueName: \"kubernetes.io/projected/51174689-eee9-47c4-95c1-890adba68f5a-kube-api-access-kgff4\") pod \"network-check-source-8894fc9bd-g7h5c\" (UID: \"51174689-eee9-47c4-95c1-890adba68f5a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c"
Apr 22 15:10:39.948545 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.948523 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-bound-sa-token\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:39.948843 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.948824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfnv2\" (UniqueName: \"kubernetes.io/projected/ca3013d5-791d-4be7-9302-a69cf49ab049-kube-api-access-hfnv2\") pod \"insights-operator-585dfdc468-w75nz\" (UID: \"ca3013d5-791d-4be7-9302-a69cf49ab049\") " pod="openshift-insights/insights-operator-585dfdc468-w75nz"
Apr 22 15:10:39.952491 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.949467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fh5r\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-kube-api-access-4fh5r\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:39.952491 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.949480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9596l\" (UniqueName: \"kubernetes.io/projected/09812955-b6a7-49e2-9f95-13ab00645d14-kube-api-access-9596l\") pod \"kube-storage-version-migrator-operator-6769c5d45-gjmkn\" (UID: \"09812955-b6a7-49e2-9f95-13ab00645d14\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn"
Apr 22 15:10:39.970416 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.970395 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5"
Apr 22 15:10:39.975084 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:39.975067 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7"
Apr 22 15:10:40.073473 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.073441 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c"
Apr 22 15:10:40.080610 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.080585 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn"
Apr 22 15:10:40.086725 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.086695 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-w75nz"
Apr 22 15:10:40.088608 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.088532 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5"]
Apr 22 15:10:40.092855 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:10:40.092824 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf68fd91_9826_405c_b2a7_d2ea31c49737.slice/crio-685dc27665a7f30f7a85c39d276a5c40ca49b1d5a8cf26f2e48c65a2e5db0033 WatchSource:0}: Error finding container 685dc27665a7f30f7a85c39d276a5c40ca49b1d5a8cf26f2e48c65a2e5db0033: Status 404 returned error can't find the container with id 685dc27665a7f30f7a85c39d276a5c40ca49b1d5a8cf26f2e48c65a2e5db0033
Apr 22 15:10:40.098583 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.098515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5" event={"ID":"af68fd91-9826-405c-b2a7-d2ea31c49737","Type":"ContainerStarted","Data":"685dc27665a7f30f7a85c39d276a5c40ca49b1d5a8cf26f2e48c65a2e5db0033"}
Apr 22 15:10:40.107663 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.107618 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-jztj7"]
Apr 22 15:10:40.111555 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:10:40.111528 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fe0a454_c595_4c12_b2ff_afc448fddec1.slice/crio-30b794f0e77e701b670be698c39cb8d86b49f8d8ce5e769c5b78a203b6d0cb72 WatchSource:0}: Error finding container 30b794f0e77e701b670be698c39cb8d86b49f8d8ce5e769c5b78a203b6d0cb72: Status 404 returned error can't find the container with id 30b794f0e77e701b670be698c39cb8d86b49f8d8ce5e769c5b78a203b6d0cb72
Apr 22 15:10:40.207515 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.207422 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c"]
Apr 22 15:10:40.212655 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:10:40.212629 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51174689_eee9_47c4_95c1_890adba68f5a.slice/crio-3373f546eca22b1b67e9640bf14fe7f00557dde85425238e507198ff1d663338 WatchSource:0}: Error finding container 3373f546eca22b1b67e9640bf14fe7f00557dde85425238e507198ff1d663338: Status 404 returned error can't find the container with id 3373f546eca22b1b67e9640bf14fe7f00557dde85425238e507198ff1d663338
Apr 22 15:10:40.428171 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.428140 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn"]
Apr 22 15:10:40.431073 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:10:40.431047 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09812955_b6a7_49e2_9f95_13ab00645d14.slice/crio-73c9e16d9c017d030fe3f155b3c822bb3467c2c2182db2666385d3deccad2919 WatchSource:0}: Error finding container 73c9e16d9c017d030fe3f155b3c822bb3467c2c2182db2666385d3deccad2919: Status 404 returned error can't find the container with id 73c9e16d9c017d030fe3f155b3c822bb3467c2c2182db2666385d3deccad2919
Apr 22 15:10:40.431275 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.431248 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w75nz"]
Apr 22 15:10:40.434805 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:10:40.434780 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca3013d5_791d_4be7_9302_a69cf49ab049.slice/crio-64d97efa6f75e50e97145243aa6bc9e540b5667e13746a078ad7ac55c898795a WatchSource:0}: Error finding container 64d97efa6f75e50e97145243aa6bc9e540b5667e13746a078ad7ac55c898795a: Status 404 returned error can't find the container with id 64d97efa6f75e50e97145243aa6bc9e540b5667e13746a078ad7ac55c898795a
Apr 22 15:10:40.440780 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.440762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:40.440868 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:40.440814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:40.440923 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:40.440908 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 15:10:40.440981 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:40.440917 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:10:40.440981 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:40.440936 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd4d896fc-ltlnc: secret "image-registry-tls" not found
Apr 22 15:10:40.440981 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:40.440960 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls podName:9070e43c-98ec-4211-8d57-0154d4934914 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:41.440946719 +0000 UTC m=+124.227583757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6p57k" (UID: "9070e43c-98ec-4211-8d57-0154d4934914") : secret "cluster-monitoring-operator-tls" not found
Apr 22 15:10:40.441094 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:40.440984 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls podName:aefab03a-4e84-4f7e-9778-d0da04255bfd nodeName:}" failed. No retries permitted until 2026-04-22 15:10:41.440966081 +0000 UTC m=+124.227603124 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls") pod "image-registry-6fd4d896fc-ltlnc" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd") : secret "image-registry-tls" not found
Apr 22 15:10:41.103683 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:41.103633 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" event={"ID":"09812955-b6a7-49e2-9f95-13ab00645d14","Type":"ContainerStarted","Data":"73c9e16d9c017d030fe3f155b3c822bb3467c2c2182db2666385d3deccad2919"}
Apr 22 15:10:41.105029 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:41.104987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" event={"ID":"8fe0a454-c595-4c12-b2ff-afc448fddec1","Type":"ContainerStarted","Data":"30b794f0e77e701b670be698c39cb8d86b49f8d8ce5e769c5b78a203b6d0cb72"}
Apr 22 15:10:41.106266 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:41.106225 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w75nz" event={"ID":"ca3013d5-791d-4be7-9302-a69cf49ab049","Type":"ContainerStarted","Data":"64d97efa6f75e50e97145243aa6bc9e540b5667e13746a078ad7ac55c898795a"}
Apr 22 15:10:41.108105 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:41.108080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c" event={"ID":"51174689-eee9-47c4-95c1-890adba68f5a","Type":"ContainerStarted","Data":"f1bba31cb8d5b76fcc341584a58cca238afa4541ec2d62afee8498fd8a363e1f"}
Apr 22 15:10:41.108105 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:41.108115 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c" event={"ID":"51174689-eee9-47c4-95c1-890adba68f5a","Type":"ContainerStarted","Data":"3373f546eca22b1b67e9640bf14fe7f00557dde85425238e507198ff1d663338"}
Apr 22 15:10:41.124345 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:41.124294 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g7h5c" podStartSLOduration=2.12427768 podStartE2EDuration="2.12427768s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:10:41.123704513 +0000 UTC m=+123.910341577" watchObservedRunningTime="2026-04-22 15:10:41.12427768 +0000 UTC m=+123.910914743"
Apr 22 15:10:41.450114 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:41.450025 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:41.450315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:41.450140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:41.450315 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:41.450233 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 15:10:41.450424 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:41.450315 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:10:41.450424 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:41.450331 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd4d896fc-ltlnc: secret "image-registry-tls" not found
Apr 22 15:10:41.450424 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:41.450373 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls podName:9070e43c-98ec-4211-8d57-0154d4934914 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:43.45029159 +0000 UTC m=+126.236928644 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6p57k" (UID: "9070e43c-98ec-4211-8d57-0154d4934914") : secret "cluster-monitoring-operator-tls" not found
Apr 22 15:10:41.450424 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:41.450399 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls podName:aefab03a-4e84-4f7e-9778-d0da04255bfd nodeName:}" failed. No retries permitted until 2026-04-22 15:10:43.450383679 +0000 UTC m=+126.237020723 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls") pod "image-registry-6fd4d896fc-ltlnc" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd") : secret "image-registry-tls" not found
Apr 22 15:10:42.112357 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:42.112315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5" event={"ID":"af68fd91-9826-405c-b2a7-d2ea31c49737","Type":"ContainerStarted","Data":"a133780d6fed8f44ff3ff517a68058802b1c5bd8632b7bc63901797dd37a7b34"}
Apr 22 15:10:42.128692 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:42.128640 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-m7bf5" podStartSLOduration=1.614479403 podStartE2EDuration="3.128623351s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="2026-04-22 15:10:40.094499705 +0000 UTC m=+122.881136743" lastFinishedPulling="2026-04-22 15:10:41.608643638 +0000 UTC m=+124.395280691" observedRunningTime="2026-04-22 15:10:42.127625696 +0000 UTC m=+124.914262780" watchObservedRunningTime="2026-04-22 15:10:42.128623351 +0000 UTC m=+124.915260411"
Apr 22 15:10:43.465061 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:43.465021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:43.465457 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:43.465175 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:10:43.465457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:43.465185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:43.465457 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:43.465259 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd4d896fc-ltlnc: secret "image-registry-tls" not found
Apr 22 15:10:43.465457 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:43.465328 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls podName:aefab03a-4e84-4f7e-9778-d0da04255bfd nodeName:}" failed. No retries permitted until 2026-04-22 15:10:47.465308212 +0000 UTC m=+130.251945271 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls") pod "image-registry-6fd4d896fc-ltlnc" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd") : secret "image-registry-tls" not found
Apr 22 15:10:43.465457 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:43.465339 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 15:10:43.465457 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:43.465393 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls podName:9070e43c-98ec-4211-8d57-0154d4934914 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:47.465373958 +0000 UTC m=+130.252010999 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6p57k" (UID: "9070e43c-98ec-4211-8d57-0154d4934914") : secret "cluster-monitoring-operator-tls" not found
Apr 22 15:10:44.118398 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:44.118354 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" event={"ID":"09812955-b6a7-49e2-9f95-13ab00645d14","Type":"ContainerStarted","Data":"eec4c3339ae636b12120d35409a0550705658a435fb50cece4d3c33c9408a88c"}
Apr 22 15:10:44.119803 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:44.119781 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/0.log"
Apr 22 15:10:44.119917 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:44.119816 2577 generic.go:358] "Generic (PLEG): container finished" podID="8fe0a454-c595-4c12-b2ff-afc448fddec1" containerID="00b1046989b2c3ae4cf85302794f8cde105f13d221e6abbf32c8c38d970c0535" exitCode=255
Apr 22 15:10:44.119917 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:44.119879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" event={"ID":"8fe0a454-c595-4c12-b2ff-afc448fddec1","Type":"ContainerDied","Data":"00b1046989b2c3ae4cf85302794f8cde105f13d221e6abbf32c8c38d970c0535"}
Apr 22 15:10:44.120127 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:44.120102 2577 scope.go:117] "RemoveContainer" containerID="00b1046989b2c3ae4cf85302794f8cde105f13d221e6abbf32c8c38d970c0535"
Apr 22 15:10:44.121183 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:44.121153 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w75nz" event={"ID":"ca3013d5-791d-4be7-9302-a69cf49ab049","Type":"ContainerStarted","Data":"a260241b59d23a9b33c73e2f9bd57707e8700f068a6028dac1bc4ca39c0eee8a"}
Apr 22 15:10:44.133633 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:44.133578 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" podStartSLOduration=2.112031299 podStartE2EDuration="5.133564713s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="2026-04-22 15:10:40.433142403 +0000 UTC m=+123.219779442" lastFinishedPulling="2026-04-22 15:10:43.454675818 +0000 UTC m=+126.241312856" observedRunningTime="2026-04-22 15:10:44.132550513 +0000 UTC m=+126.919187574" watchObservedRunningTime="2026-04-22 15:10:44.133564713 +0000 UTC m=+126.920201774"
Apr 22 15:10:44.166474 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:44.166423 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-w75nz" podStartSLOduration=2.152456927 podStartE2EDuration="5.166408519s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="2026-04-22 15:10:40.436389801 +0000 UTC m=+123.223026839" lastFinishedPulling="2026-04-22 15:10:43.450341379 +0000 UTC m=+126.236978431" observedRunningTime="2026-04-22 15:10:44.165509053 +0000 UTC m=+126.952146115" watchObservedRunningTime="2026-04-22 15:10:44.166408519 +0000 UTC m=+126.953045578"
Apr 22 15:10:45.125375 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:45.125350 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log"
Apr 22 15:10:45.125767 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:45.125707 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/0.log"
Apr 22 15:10:45.125767 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:45.125736 2577 generic.go:358] "Generic (PLEG): container finished" podID="8fe0a454-c595-4c12-b2ff-afc448fddec1" containerID="6d4df02f86089430deee27100c8f0477d33875a4a83d6105d3e19e0f069b25ce" exitCode=255
Apr 22 15:10:45.125866 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:45.125839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" event={"ID":"8fe0a454-c595-4c12-b2ff-afc448fddec1","Type":"ContainerDied","Data":"6d4df02f86089430deee27100c8f0477d33875a4a83d6105d3e19e0f069b25ce"}
Apr 22 15:10:45.125920 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:45.125891 2577 scope.go:117] "RemoveContainer" containerID="00b1046989b2c3ae4cf85302794f8cde105f13d221e6abbf32c8c38d970c0535"
Apr 22 15:10:45.126044 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:45.126026 2577 scope.go:117] "RemoveContainer" containerID="6d4df02f86089430deee27100c8f0477d33875a4a83d6105d3e19e0f069b25ce"
Apr 22 15:10:45.126285 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:45.126262 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-jztj7_openshift-console-operator(8fe0a454-c595-4c12-b2ff-afc448fddec1)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" podUID="8fe0a454-c595-4c12-b2ff-afc448fddec1"
Apr 22 15:10:46.129927 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:46.129901 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log"
Apr 22 15:10:46.130424 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:46.130220 2577 scope.go:117] "RemoveContainer" containerID="6d4df02f86089430deee27100c8f0477d33875a4a83d6105d3e19e0f069b25ce"
Apr 22 15:10:46.130424 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:46.130384 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-jztj7_openshift-console-operator(8fe0a454-c595-4c12-b2ff-afc448fddec1)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" podUID="8fe0a454-c595-4c12-b2ff-afc448fddec1"
Apr 22 15:10:46.581460 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:46.581434 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4sxvb_031a7138-6b28-4cf1-9f28-ca9c3f9e3225/dns-node-resolver/0.log"
Apr 22 15:10:47.498569 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:47.498531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:47.498598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:47.498626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:47.498694 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:47.498725 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:47.498729 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:47.498744 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fd4d896fc-ltlnc: secret "image-registry-tls" not found
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:47.498766 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls podName:9070e43c-98ec-4211-8d57-0154d4934914 nodeName:}" failed. No retries permitted until 2026-04-22 15:10:55.498747474 +0000 UTC m=+138.285384513 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6p57k" (UID: "9070e43c-98ec-4211-8d57-0154d4934914") : secret "cluster-monitoring-operator-tls" not found
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:47.498780 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs podName:99c788ee-8bf0-4eb7-9e35-f464df2ca01e nodeName:}" failed. No retries permitted until 2026-04-22 15:12:49.498774062 +0000 UTC m=+252.285411100 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs") pod "network-metrics-daemon-9nk69" (UID: "99c788ee-8bf0-4eb7-9e35-f464df2ca01e") : secret "metrics-daemon-secret" not found
Apr 22 15:10:47.498947 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:47.498790 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls podName:aefab03a-4e84-4f7e-9778-d0da04255bfd nodeName:}" failed. No retries permitted until 2026-04-22 15:10:55.498785137 +0000 UTC m=+138.285422174 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls") pod "image-registry-6fd4d896fc-ltlnc" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd") : secret "image-registry-tls" not found
Apr 22 15:10:47.980417 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:47.980386 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vrpzt_d075efdc-d5f5-490a-a543-09e52a1f9e38/node-ca/0.log"
Apr 22 15:10:49.975651 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:49.975622 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7"
Apr 22 15:10:49.976093 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:49.975664 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7"
Apr 22 15:10:49.976160 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:49.976110 2577 scope.go:117] "RemoveContainer" containerID="6d4df02f86089430deee27100c8f0477d33875a4a83d6105d3e19e0f069b25ce"
Apr 22 15:10:49.976348 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:10:49.976326 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-jztj7_openshift-console-operator(8fe0a454-c595-4c12-b2ff-afc448fddec1)\"" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" podUID="8fe0a454-c595-4c12-b2ff-afc448fddec1"
Apr 22 15:10:55.559977 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:55.559938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:55.560370 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:55.560006 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:55.562323 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:55.562298 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9070e43c-98ec-4211-8d57-0154d4934914-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6p57k\" (UID: \"9070e43c-98ec-4211-8d57-0154d4934914\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:55.562436 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:55.562354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"image-registry-6fd4d896fc-ltlnc\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:55.693739 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:55.693710 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"
Apr 22 15:10:55.699403 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:55.699380 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:10:55.839395 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:55.839322 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k"]
Apr 22 15:10:55.842345 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:10:55.842317 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9070e43c_98ec_4211_8d57_0154d4934914.slice/crio-e5308f17eb9caffe5c802436bfb2ebc92574196e1b1766e906719dc2cc992b15 WatchSource:0}: Error finding container e5308f17eb9caffe5c802436bfb2ebc92574196e1b1766e906719dc2cc992b15: Status 404 returned error can't find the container with id e5308f17eb9caffe5c802436bfb2ebc92574196e1b1766e906719dc2cc992b15
Apr 22 15:10:55.852759 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:55.852738 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fd4d896fc-ltlnc"]
Apr 22 15:10:55.856445 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:10:55.856418 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaefab03a_4e84_4f7e_9778_d0da04255bfd.slice/crio-29caac25b73893a207a9c9332bcdd86a61a9b5e7dfb17247137acec2df5a69fb WatchSource:0}: Error finding container 29caac25b73893a207a9c9332bcdd86a61a9b5e7dfb17247137acec2df5a69fb: Status 404 returned error can't find the container with id 29caac25b73893a207a9c9332bcdd86a61a9b5e7dfb17247137acec2df5a69fb
Apr 22 15:10:56.155001 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:56.154922 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" event={"ID":"9070e43c-98ec-4211-8d57-0154d4934914","Type":"ContainerStarted","Data":"e5308f17eb9caffe5c802436bfb2ebc92574196e1b1766e906719dc2cc992b15"}
Apr 22 15:10:56.156132
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:56.156108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerStarted","Data":"0f74f834cac41ef611da965d194afe76e6d0a2c80931b624b4e84e914f932a1c"} Apr 22 15:10:56.156237 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:56.156140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerStarted","Data":"29caac25b73893a207a9c9332bcdd86a61a9b5e7dfb17247137acec2df5a69fb"} Apr 22 15:10:56.156237 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:56.156227 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:10:56.175991 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:56.175948 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podStartSLOduration=17.175933387 podStartE2EDuration="17.175933387s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:10:56.175482294 +0000 UTC m=+138.962119364" watchObservedRunningTime="2026-04-22 15:10:56.175933387 +0000 UTC m=+138.962570440" Apr 22 15:10:58.162710 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:58.162674 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" event={"ID":"9070e43c-98ec-4211-8d57-0154d4934914","Type":"ContainerStarted","Data":"d6bcaa14e9a3c5f473d2d408fb5006c321db84ba5a5af212cdf62dd912eb4430"} Apr 22 15:10:58.179046 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:10:58.179000 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6p57k" podStartSLOduration=17.58894421 podStartE2EDuration="19.178987332s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="2026-04-22 15:10:55.843991141 +0000 UTC m=+138.630628178" lastFinishedPulling="2026-04-22 15:10:57.43403426 +0000 UTC m=+140.220671300" observedRunningTime="2026-04-22 15:10:58.178521816 +0000 UTC m=+140.965158889" watchObservedRunningTime="2026-04-22 15:10:58.178987332 +0000 UTC m=+140.965624392" Apr 22 15:11:00.799925 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:00.799895 2577 scope.go:117] "RemoveContainer" containerID="6d4df02f86089430deee27100c8f0477d33875a4a83d6105d3e19e0f069b25ce" Apr 22 15:11:01.171503 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:01.171426 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log" Apr 22 15:11:01.171636 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:01.171520 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" event={"ID":"8fe0a454-c595-4c12-b2ff-afc448fddec1","Type":"ContainerStarted","Data":"a0db44e6a0cc44c40a67e398411bb0717f5437b0acc0863e4d7894c50a8d714a"} Apr 22 15:11:01.171832 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:01.171799 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:11:01.193343 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:01.193291 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" podStartSLOduration=18.862674852 podStartE2EDuration="22.193278219s" podCreationTimestamp="2026-04-22 15:10:39 +0000 UTC" firstStartedPulling="2026-04-22 15:10:40.113999859 +0000 UTC 
m=+122.900636898" lastFinishedPulling="2026-04-22 15:10:43.44460321 +0000 UTC m=+126.231240265" observedRunningTime="2026-04-22 15:11:01.192333622 +0000 UTC m=+143.978970739" watchObservedRunningTime="2026-04-22 15:11:01.193278219 +0000 UTC m=+143.979915278" Apr 22 15:11:01.267885 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:01.267857 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-jztj7" Apr 22 15:11:13.612810 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:11:13.612770 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-9thxk" podUID="99b60141-c5a1-4685-b0c9-f59380bb89b8" Apr 22 15:11:13.615943 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:11:13.615920 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-nlf5r" podUID="c6c9ff67-fc53-4fad-bac9-aa152e2c0640" Apr 22 15:11:13.817587 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:11:13.817552 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-9nk69" podUID="99c788ee-8bf0-4eb7-9e35-f464df2ca01e" Apr 22 15:11:14.202439 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:14.202406 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9thxk" Apr 22 15:11:15.703208 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:15.703173 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:11:15.703572 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:15.703250 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:11:17.163244 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:17.163215 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:11:17.163608 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:17.163275 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:11:18.529522 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:18.529490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk" Apr 22 
15:11:18.529522 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:18.529526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r" Apr 22 15:11:18.531796 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:18.531766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c6c9ff67-fc53-4fad-bac9-aa152e2c0640-cert\") pod \"ingress-canary-nlf5r\" (UID: \"c6c9ff67-fc53-4fad-bac9-aa152e2c0640\") " pod="openshift-ingress-canary/ingress-canary-nlf5r" Apr 22 15:11:18.532322 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:18.532304 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99b60141-c5a1-4685-b0c9-f59380bb89b8-metrics-tls\") pod \"dns-default-9thxk\" (UID: \"99b60141-c5a1-4685-b0c9-f59380bb89b8\") " pod="openshift-dns/dns-default-9thxk" Apr 22 15:11:18.710107 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:18.710077 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkblf\"" Apr 22 15:11:18.714203 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:18.714176 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-9thxk" Apr 22 15:11:18.849321 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:18.849290 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9thxk"] Apr 22 15:11:18.852289 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:11:18.852252 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99b60141_c5a1_4685_b0c9_f59380bb89b8.slice/crio-65c04085864869f01ed55d8f0e7f2666009e8482e5471c0d15eecd0380bf00ae WatchSource:0}: Error finding container 65c04085864869f01ed55d8f0e7f2666009e8482e5471c0d15eecd0380bf00ae: Status 404 returned error can't find the container with id 65c04085864869f01ed55d8f0e7f2666009e8482e5471c0d15eecd0380bf00ae Apr 22 15:11:19.214710 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:19.214629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9thxk" event={"ID":"99b60141-c5a1-4685-b0c9-f59380bb89b8","Type":"ContainerStarted","Data":"65c04085864869f01ed55d8f0e7f2666009e8482e5471c0d15eecd0380bf00ae"} Apr 22 15:11:21.221039 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:21.221005 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9thxk" event={"ID":"99b60141-c5a1-4685-b0c9-f59380bb89b8","Type":"ContainerStarted","Data":"a2292693f80b5167c3595dbebbd35fb6f319bcdf0928977bb37a927e1c796136"} Apr 22 15:11:21.221039 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:21.221038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9thxk" event={"ID":"99b60141-c5a1-4685-b0c9-f59380bb89b8","Type":"ContainerStarted","Data":"e1580b293a9c40a759f16e36798f7d25f12ab7f8f14843f4e5dc00d400a4adbd"} Apr 22 15:11:21.221483 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:21.221140 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9thxk" Apr 22 15:11:21.241417 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:21.241372 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9thxk" podStartSLOduration=129.770563427 podStartE2EDuration="2m11.241360031s" podCreationTimestamp="2026-04-22 15:09:10 +0000 UTC" firstStartedPulling="2026-04-22 15:11:18.854030058 +0000 UTC m=+161.640667099" lastFinishedPulling="2026-04-22 15:11:20.324826665 +0000 UTC m=+163.111463703" observedRunningTime="2026-04-22 15:11:21.240007963 +0000 UTC m=+164.026645022" watchObservedRunningTime="2026-04-22 15:11:21.241360031 +0000 UTC m=+164.027997148" Apr 22 15:11:25.703538 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:25.703495 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:11:25.703929 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:25.703555 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:11:26.799672 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:26.799627 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69" Apr 22 15:11:27.164053 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:27.163973 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:11:27.164053 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:27.164025 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:11:28.799354 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:28.799323 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nlf5r" Apr 22 15:11:28.803034 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:28.803011 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cbqpc\"" Apr 22 15:11:28.809834 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:28.809817 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nlf5r" Apr 22 15:11:28.956229 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:28.956168 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nlf5r"] Apr 22 15:11:28.958788 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:11:28.958756 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c9ff67_fc53_4fad_bac9_aa152e2c0640.slice/crio-45924ac390854c1b20c18ca6df819d5d441c21c383dbe7d643bf2fa984f66e7c WatchSource:0}: Error finding container 45924ac390854c1b20c18ca6df819d5d441c21c383dbe7d643bf2fa984f66e7c: Status 404 returned error can't find the container with id 45924ac390854c1b20c18ca6df819d5d441c21c383dbe7d643bf2fa984f66e7c Apr 22 15:11:29.243604 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:29.243518 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nlf5r" event={"ID":"c6c9ff67-fc53-4fad-bac9-aa152e2c0640","Type":"ContainerStarted","Data":"45924ac390854c1b20c18ca6df819d5d441c21c383dbe7d643bf2fa984f66e7c"} Apr 22 15:11:31.226329 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:31.226304 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9thxk" Apr 22 15:11:31.250140 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:31.250109 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nlf5r" event={"ID":"c6c9ff67-fc53-4fad-bac9-aa152e2c0640","Type":"ContainerStarted","Data":"5634ad7f46e6014467293f29198f014b4fa9fb58205543d107d32c26d070a989"} Apr 22 15:11:31.271367 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:31.271325 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nlf5r" podStartSLOduration=139.799857331 podStartE2EDuration="2m21.271313183s" 
podCreationTimestamp="2026-04-22 15:09:10 +0000 UTC" firstStartedPulling="2026-04-22 15:11:28.960589627 +0000 UTC m=+171.747226668" lastFinishedPulling="2026-04-22 15:11:30.432045482 +0000 UTC m=+173.218682520" observedRunningTime="2026-04-22 15:11:31.270752358 +0000 UTC m=+174.057389417" watchObservedRunningTime="2026-04-22 15:11:31.271313183 +0000 UTC m=+174.057950243" Apr 22 15:11:35.703592 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:35.703558 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:11:35.704044 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:35.703612 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:11:35.704044 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:35.703658 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:11:35.704238 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:35.704185 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"0f74f834cac41ef611da965d194afe76e6d0a2c80931b624b4e84e914f932a1c"} pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" containerMessage="Container registry failed liveness probe, will be restarted" Apr 22 15:11:35.707354 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:35.707330 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: 
Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:11:35.707456 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:35.707377 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:11:45.707414 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:45.707380 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:11:45.707888 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:45.707449 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:11:55.708233 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:55.708189 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:11:55.708598 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:11:55.708253 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" 
podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:12:00.722094 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:00.722047 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" containerID="cri-o://0f74f834cac41ef611da965d194afe76e6d0a2c80931b624b4e84e914f932a1c" gracePeriod=30 Apr 22 15:12:01.323217 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:01.323168 2577 generic.go:358] "Generic (PLEG): container finished" podID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerID="0f74f834cac41ef611da965d194afe76e6d0a2c80931b624b4e84e914f932a1c" exitCode=0 Apr 22 15:12:01.323374 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:01.323249 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerDied","Data":"0f74f834cac41ef611da965d194afe76e6d0a2c80931b624b4e84e914f932a1c"} Apr 22 15:12:01.323374 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:01.323293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerStarted","Data":"729921e8e42447a1f469b1bcdee697049cff7f69a9a253861e8a183e388b6e6d"} Apr 22 15:12:01.323469 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:01.323457 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:12:05.334544 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:05.334510 2577 generic.go:358] "Generic (PLEG): container finished" podID="09812955-b6a7-49e2-9f95-13ab00645d14" containerID="eec4c3339ae636b12120d35409a0550705658a435fb50cece4d3c33c9408a88c" 
exitCode=0 Apr 22 15:12:05.334911 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:05.334584 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" event={"ID":"09812955-b6a7-49e2-9f95-13ab00645d14","Type":"ContainerDied","Data":"eec4c3339ae636b12120d35409a0550705658a435fb50cece4d3c33c9408a88c"} Apr 22 15:12:05.334911 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:05.334899 2577 scope.go:117] "RemoveContainer" containerID="eec4c3339ae636b12120d35409a0550705658a435fb50cece4d3c33c9408a88c" Apr 22 15:12:06.338490 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:06.338457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gjmkn" event={"ID":"09812955-b6a7-49e2-9f95-13ab00645d14","Type":"ContainerStarted","Data":"631e16b724d143c8c22a2f2763b4b2e44d32c5c2992e39819c2b3a8f24d8e94d"} Apr 22 15:12:09.346603 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:09.346568 2577 generic.go:358] "Generic (PLEG): container finished" podID="ca3013d5-791d-4be7-9302-a69cf49ab049" containerID="a260241b59d23a9b33c73e2f9bd57707e8700f068a6028dac1bc4ca39c0eee8a" exitCode=0 Apr 22 15:12:09.346953 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:09.346613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w75nz" event={"ID":"ca3013d5-791d-4be7-9302-a69cf49ab049","Type":"ContainerDied","Data":"a260241b59d23a9b33c73e2f9bd57707e8700f068a6028dac1bc4ca39c0eee8a"} Apr 22 15:12:09.346953 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:09.346944 2577 scope.go:117] "RemoveContainer" containerID="a260241b59d23a9b33c73e2f9bd57707e8700f068a6028dac1bc4ca39c0eee8a" Apr 22 15:12:10.350279 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:10.350241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-w75nz" event={"ID":"ca3013d5-791d-4be7-9302-a69cf49ab049","Type":"ContainerStarted","Data":"d7f87cce290ecb345808ddded7022d1992307f1248100e971c71268cb89e463e"} Apr 22 15:12:12.551152 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:12.551121 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log" Apr 22 15:12:12.752944 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:12.752907 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/2.log" Apr 22 15:12:12.951816 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:12.951739 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9thxk_99b60141-c5a1-4685-b0c9-f59380bb89b8/dns/0.log" Apr 22 15:12:13.151425 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:13.151398 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9thxk_99b60141-c5a1-4685-b0c9-f59380bb89b8/kube-rbac-proxy/0.log" Apr 22 15:12:14.151433 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:14.151401 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4sxvb_031a7138-6b28-4cf1-9f28-ca9c3f9e3225/dns-node-resolver/0.log" Apr 22 15:12:15.151995 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:15.151921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6fd4d896fc-ltlnc_aefab03a-4e84-4f7e-9778-d0da04255bfd/registry/0.log" Apr 22 15:12:15.350614 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:15.350583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6fd4d896fc-ltlnc_aefab03a-4e84-4f7e-9778-d0da04255bfd/registry/1.log" Apr 22 15:12:15.703578 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:15.703548 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:12:15.703724 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:15.703602 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:12:15.952306 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:15.952275 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vrpzt_d075efdc-d5f5-490a-a543-09e52a1f9e38/node-ca/0.log"
Apr 22 15:12:16.751571 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:16.751538 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nlf5r_c6c9ff67-fc53-4fad-bac9-aa152e2c0640/serve-healthcheck-canary/0.log"
Apr 22 15:12:17.352646 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:17.352617 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gjmkn_09812955-b6a7-49e2-9f95-13ab00645d14/kube-storage-version-migrator-operator/0.log"
Apr 22 15:12:17.552186 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:17.552157 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gjmkn_09812955-b6a7-49e2-9f95-13ab00645d14/kube-storage-version-migrator-operator/1.log"
Apr 22 15:12:22.330520 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:22.330488 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:12:22.330871 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:22.330548 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:12:25.703172 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:25.703138 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:12:25.703560 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:25.703190 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:12:32.330612 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:32.330576 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:12:32.330988 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:32.330629 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:12:35.703254 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:35.703222 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:12:35.703671 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:35.703270 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:12:35.703671 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:35.703306 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:12:35.703798 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:35.703676 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"729921e8e42447a1f469b1bcdee697049cff7f69a9a253861e8a183e388b6e6d"} pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 22 15:12:35.706888 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:35.706860 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:12:35.707005 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:35.706910 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:12:45.707378 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:45.707303 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:12:45.707378 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:45.707365 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:12:49.568328 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:49.568282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:12:49.570542 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:49.570524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c788ee-8bf0-4eb7-9e35-f464df2ca01e-metrics-certs\") pod \"network-metrics-daemon-9nk69\" (UID: \"99c788ee-8bf0-4eb7-9e35-f464df2ca01e\") " pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:12:49.602498 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:49.602471 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-62z2n\""
Apr 22 15:12:49.610141 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:49.610122 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-9nk69"
Apr 22 15:12:49.720438 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:49.720407 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-9nk69"]
Apr 22 15:12:49.723385 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:12:49.723355 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c788ee_8bf0_4eb7_9e35_f464df2ca01e.slice/crio-390db0d2f39319f343a18f6cc8015cbdab36dc29311551c1a64acf91992342ed WatchSource:0}: Error finding container 390db0d2f39319f343a18f6cc8015cbdab36dc29311551c1a64acf91992342ed: Status 404 returned error can't find the container with id 390db0d2f39319f343a18f6cc8015cbdab36dc29311551c1a64acf91992342ed
Apr 22 15:12:50.447313 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:50.447270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9nk69" event={"ID":"99c788ee-8bf0-4eb7-9e35-f464df2ca01e","Type":"ContainerStarted","Data":"390db0d2f39319f343a18f6cc8015cbdab36dc29311551c1a64acf91992342ed"}
Apr 22 15:12:51.451730 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:51.451694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9nk69" event={"ID":"99c788ee-8bf0-4eb7-9e35-f464df2ca01e","Type":"ContainerStarted","Data":"436ccb665a86f20ddcbea4839f4b6545491c9c86cbc4a873064ecb1f55fdfc9d"}
Apr 22 15:12:51.451730 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:51.451733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-9nk69" event={"ID":"99c788ee-8bf0-4eb7-9e35-f464df2ca01e","Type":"ContainerStarted","Data":"1f060c0d77bd5c18756f697c4f2554781bbe9ae65410b5c992d4e98ac558baca"}
Apr 22 15:12:51.469692 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:51.469651 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-9nk69" podStartSLOduration=253.437191469 podStartE2EDuration="4m14.469637552s" podCreationTimestamp="2026-04-22 15:08:37 +0000 UTC" firstStartedPulling="2026-04-22 15:12:49.725098698 +0000 UTC m=+252.511735735" lastFinishedPulling="2026-04-22 15:12:50.757544759 +0000 UTC m=+253.544181818" observedRunningTime="2026-04-22 15:12:51.468725686 +0000 UTC m=+254.255362747" watchObservedRunningTime="2026-04-22 15:12:51.469637552 +0000 UTC m=+254.256274612"
Apr 22 15:12:55.707603 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:55.707572 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:12:55.707968 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:12:55.707624 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:13:00.721439 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:00.721387 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" containerID="cri-o://729921e8e42447a1f469b1bcdee697049cff7f69a9a253861e8a183e388b6e6d" gracePeriod=30
Apr 22 15:13:01.477583 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:01.477548 2577 generic.go:358] "Generic (PLEG): container finished" podID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerID="729921e8e42447a1f469b1bcdee697049cff7f69a9a253861e8a183e388b6e6d" exitCode=0
Apr 22 15:13:01.477764 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:01.477624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerDied","Data":"729921e8e42447a1f469b1bcdee697049cff7f69a9a253861e8a183e388b6e6d"}
Apr 22 15:13:01.477764 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:01.477663 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerStarted","Data":"6675fdfb07dad63ce7b1adebaae1c6e636ba7a72ab23359a63114af3fdba8842"}
Apr 22 15:13:01.477764 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:01.477682 2577 scope.go:117] "RemoveContainer" containerID="0f74f834cac41ef611da965d194afe76e6d0a2c80931b624b4e84e914f932a1c"
Apr 22 15:13:01.477939 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:01.477836 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:13:15.703482 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:15.703451 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:13:15.703836 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:15.703499 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:13:22.489910 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:22.489877 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:13:22.490392 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:22.489935 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:13:25.704479 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:25.704438 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:13:25.704861 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:25.704494 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:13:32.490170 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:32.490134 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:13:32.490543 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:32.490208 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:13:35.703515 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:35.703481 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:13:35.703901 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:35.703535 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:13:35.703901 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:35.703575 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:13:35.704003 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:35.703983 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"6675fdfb07dad63ce7b1adebaae1c6e636ba7a72ab23359a63114af3fdba8842"} pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 22 15:13:35.707154 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:35.707127 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:13:35.707300 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:35.707179 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:13:37.676381 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:37.676353 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log"
Apr 22 15:13:37.677608 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:37.677585 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log"
Apr 22 15:13:37.679481 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:37.679463 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log"
Apr 22 15:13:37.680911 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:37.680890 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log"
Apr 22 15:13:37.685357 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:37.685340 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 22 15:13:45.708056 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:45.708019 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:13:45.709438 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:45.708078 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:13:55.707906 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:55.707868 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:13:55.708393 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:13:55.707931 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:14:00.721724 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:00.721686 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" containerID="cri-o://6675fdfb07dad63ce7b1adebaae1c6e636ba7a72ab23359a63114af3fdba8842" gracePeriod=30
Apr 22 15:14:00.837219 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:00.837182 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:14:01.634250 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:01.634217 2577 generic.go:358] "Generic (PLEG): container finished" podID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerID="6675fdfb07dad63ce7b1adebaae1c6e636ba7a72ab23359a63114af3fdba8842" exitCode=0
Apr 22 15:14:01.634437 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:01.634283 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerDied","Data":"6675fdfb07dad63ce7b1adebaae1c6e636ba7a72ab23359a63114af3fdba8842"}
Apr 22 15:14:01.634437 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:01.634308 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerStarted","Data":"f70dae41e51cbf0756e9710d9670eff303334d0bc137edebbab3f9037bd626f5"}
Apr 22 15:14:01.634437 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:01.634323 2577 scope.go:117] "RemoveContainer" containerID="729921e8e42447a1f469b1bcdee697049cff7f69a9a253861e8a183e388b6e6d"
Apr 22 15:14:01.634619 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:01.634464 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:14:15.703694 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:15.703617 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:14:15.703694 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:15.703674 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:14:22.642282 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:22.642248 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:14:22.642641 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:22.642301 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:14:25.704054 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:25.703772 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:14:25.704054 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:25.703896 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:14:32.642372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:32.642336 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:14:32.642899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:32.642403 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:14:35.703124 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:35.703092 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:14:35.703495 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:35.703148 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:14:35.703495 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:35.703189 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:14:35.703690 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:35.703671 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"f70dae41e51cbf0756e9710d9670eff303334d0bc137edebbab3f9037bd626f5"} pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 22 15:14:35.706894 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:35.706870 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:14:35.707009 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:35.706908 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:14:45.707839 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:45.707808 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:14:45.708304 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:45.707861 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:14:55.707483 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:55.707448 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:14:55.707845 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:14:55.707501 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:15:00.721603 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:00.721560 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" containerID="cri-o://f70dae41e51cbf0756e9710d9670eff303334d0bc137edebbab3f9037bd626f5" gracePeriod=30
Apr 22 15:15:01.779785 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:01.779748 2577 generic.go:358] "Generic (PLEG): container finished" podID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerID="f70dae41e51cbf0756e9710d9670eff303334d0bc137edebbab3f9037bd626f5" exitCode=0
Apr 22 15:15:01.780176 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:01.779814 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerDied","Data":"f70dae41e51cbf0756e9710d9670eff303334d0bc137edebbab3f9037bd626f5"}
Apr 22 15:15:01.780176 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:01.779850 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerStarted","Data":"b6f1931e486d54dd482a9dd1e270af3edf763592f4ed349ff3fd959677129682"}
Apr 22 15:15:01.780176 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:01.779866 2577 scope.go:117] "RemoveContainer" containerID="6675fdfb07dad63ce7b1adebaae1c6e636ba7a72ab23359a63114af3fdba8842"
Apr 22 15:15:01.780176 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:01.779997 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:15:15.703750 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:15.703716 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:15:15.704213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:15.703769 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:15:22.790265 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:22.790232 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:15:22.790718 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:22.790299 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:15:25.703357 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:25.703314 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:15:25.703706 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:25.703374 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:15:32.790824 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:32.790788 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:15:32.791190 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:32.790838 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:15:35.703232 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:35.703178 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:15:35.703620 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:35.703246 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:15:35.703620 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:35.703286 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:15:35.703718 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:35.703700 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"b6f1931e486d54dd482a9dd1e270af3edf763592f4ed349ff3fd959677129682"} pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 22 15:15:35.706955 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:35.706930 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:15:35.707075 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:35.706967 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:15:45.707308 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:45.707232 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:15:45.707678 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:45.707298 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:15:55.707711 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:55.707676 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:15:55.708156 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:15:55.707729 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:16:00.721498 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:00.721452 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" containerID="cri-o://b6f1931e486d54dd482a9dd1e270af3edf763592f4ed349ff3fd959677129682" gracePeriod=30
Apr 22 15:16:00.928463 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:00.928433 2577 generic.go:358] "Generic (PLEG): container finished" podID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerID="b6f1931e486d54dd482a9dd1e270af3edf763592f4ed349ff3fd959677129682" exitCode=0
Apr 22 15:16:00.928616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:00.928485 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerDied","Data":"b6f1931e486d54dd482a9dd1e270af3edf763592f4ed349ff3fd959677129682"}
Apr 22 15:16:00.928616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:00.928511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerStarted","Data":"070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d"}
Apr 22 15:16:00.928616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:00.928534 2577 scope.go:117] "RemoveContainer" containerID="f70dae41e51cbf0756e9710d9670eff303334d0bc137edebbab3f9037bd626f5"
Apr 22 15:16:00.928782 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:00.928685 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc"
Apr 22 15:16:15.704103 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:15.704069 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 15:16:15.704492 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:15.704118 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 15:16:21.939299 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:21.939262 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503"
start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:16:21.939678 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:21.939328 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:16:25.703702 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:25.703668 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:16:25.704066 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:25.703727 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:16:31.939676 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:31.939641 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:16:31.940129 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:31.939694 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed 
with statuscode: 503" Apr 22 15:16:35.703480 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:35.703440 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:16:35.703866 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:35.703501 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:16:35.703866 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:35.703550 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:16:35.704052 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:35.704028 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d"} pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" containerMessage="Container registry failed liveness probe, will be restarted" Apr 22 15:16:35.707489 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:35.707462 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:16:35.707618 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:35.707508 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:16:45.707846 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:45.707812 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:16:45.708312 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:45.707870 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:16:55.707950 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:55.707909 2577 patch_prober.go:28] interesting pod/image-registry-6fd4d896fc-ltlnc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 15:16:55.708430 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:16:55.707979 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 15:17:00.721935 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:00.721892 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" 
podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" containerID="cri-o://070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" gracePeriod=30 Apr 22 15:17:00.832982 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:17:00.832958 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=registry pod=image-registry-6fd4d896fc-ltlnc_openshift-image-registry(aefab03a-4e84-4f7e-9778-d0da04255bfd)\"" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" Apr 22 15:17:01.077748 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:01.077717 2577 generic.go:358] "Generic (PLEG): container finished" podID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerID="070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" exitCode=0 Apr 22 15:17:01.077904 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:01.077788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerDied","Data":"070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d"} Apr 22 15:17:01.077904 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:01.077829 2577 scope.go:117] "RemoveContainer" containerID="b6f1931e486d54dd482a9dd1e270af3edf763592f4ed349ff3fd959677129682" Apr 22 15:17:01.078178 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:01.078160 2577 scope.go:117] "RemoveContainer" containerID="070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" Apr 22 15:17:01.078394 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:17:01.078370 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=registry 
pod=image-registry-6fd4d896fc-ltlnc_openshift-image-registry(aefab03a-4e84-4f7e-9778-d0da04255bfd)\"" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" Apr 22 15:17:12.800211 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:12.800173 2577 scope.go:117] "RemoveContainer" containerID="070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" Apr 22 15:17:12.800599 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:17:12.800381 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=registry pod=image-registry-6fd4d896fc-ltlnc_openshift-image-registry(aefab03a-4e84-4f7e-9778-d0da04255bfd)\"" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" Apr 22 15:17:24.799592 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:24.799514 2577 scope.go:117] "RemoveContainer" containerID="070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" Apr 22 15:17:24.800018 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:17:24.799685 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=registry pod=image-registry-6fd4d896fc-ltlnc_openshift-image-registry(aefab03a-4e84-4f7e-9778-d0da04255bfd)\"" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" Apr 22 15:17:37.801169 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:37.801142 2577 scope.go:117] "RemoveContainer" containerID="070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" Apr 22 15:17:37.801632 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:17:37.801322 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry\" with CrashLoopBackOff: \"back-off 1m20s restarting 
failed container=registry pod=image-registry-6fd4d896fc-ltlnc_openshift-image-registry(aefab03a-4e84-4f7e-9778-d0da04255bfd)\"" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" Apr 22 15:17:50.799844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:17:50.799813 2577 scope.go:117] "RemoveContainer" containerID="070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" Apr 22 15:17:50.800306 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:17:50.799991 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=registry pod=image-registry-6fd4d896fc-ltlnc_openshift-image-registry(aefab03a-4e84-4f7e-9778-d0da04255bfd)\"" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" Apr 22 15:18:01.799411 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:01.799381 2577 scope.go:117] "RemoveContainer" containerID="070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" Apr 22 15:18:01.799779 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:18:01.799606 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=registry pod=image-registry-6fd4d896fc-ltlnc_openshift-image-registry(aefab03a-4e84-4f7e-9778-d0da04255bfd)\"" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" Apr 22 15:18:05.199375 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.199343 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2l9ld"] Apr 22 15:18:05.202233 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.202217 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.206312 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.206288 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 15:18:05.207268 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.207236 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 15:18:05.207378 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.207271 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-c6286\"" Apr 22 15:18:05.217440 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.217419 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2l9ld"] Apr 22 15:18:05.308424 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.308396 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-7knnb"] Apr 22 15:18:05.311358 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.311340 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7knnb" Apr 22 15:18:05.315890 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.315859 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 15:18:05.317431 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.317409 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 15:18:05.319428 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.319408 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-2vblf\"" Apr 22 15:18:05.323355 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.323335 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-758f9c8856-gpqgw"] Apr 22 15:18:05.326341 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.326319 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6fd4d896fc-ltlnc"] Apr 22 15:18:05.326341 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.326351 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7htkt"] Apr 22 15:18:05.326518 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.326499 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.329265 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.329241 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7knnb"] Apr 22 15:18:05.329407 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.329326 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" Apr 22 15:18:05.329927 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.329910 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 15:18:05.330062 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.330045 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 15:18:05.330161 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.330110 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 15:18:05.330464 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.330449 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 15:18:05.330569 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.330546 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 15:18:05.330669 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.330597 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-tkz4h\"" Apr 22 15:18:05.331885 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.331864 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 15:18:05.333872 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.333801 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 15:18:05.338074 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.338052 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" 
Apr 22 15:18:05.338829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.338805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/672858ed-71c0-4480-881d-f921a89639d3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.338962 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.338849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nxcp\" (UniqueName: \"kubernetes.io/projected/672858ed-71c0-4480-881d-f921a89639d3-kube-api-access-4nxcp\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.338962 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.338950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/672858ed-71c0-4480-881d-f921a89639d3-crio-socket\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.339066 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.338994 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/672858ed-71c0-4480-881d-f921a89639d3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.339066 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.339025 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/672858ed-71c0-4480-881d-f921a89639d3-data-volume\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.343187 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.342859 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-6dxx4\"" Apr 22 15:18:05.344534 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.344512 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7htkt"] Apr 22 15:18:05.351164 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.350300 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-758f9c8856-gpqgw"] Apr 22 15:18:05.439415 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-oauth-config\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.439507 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-trusted-ca-bundle\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.439507 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439452 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7frt\" (UniqueName: 
\"kubernetes.io/projected/7e2980bd-3504-4803-9825-ab03e37698f6-kube-api-access-z7frt\") pod \"downloads-6bcc868b7-7knnb\" (UID: \"7e2980bd-3504-4803-9825-ab03e37698f6\") " pod="openshift-console/downloads-6bcc868b7-7knnb" Apr 22 15:18:05.439507 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/672858ed-71c0-4480-881d-f921a89639d3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.439646 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nxcp\" (UniqueName: \"kubernetes.io/projected/672858ed-71c0-4480-881d-f921a89639d3-kube-api-access-4nxcp\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.439646 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-console-config\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.439646 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psxf9\" (UniqueName: \"kubernetes.io/projected/19c9b964-41f6-4e08-a813-5fe686840c29-kube-api-access-psxf9\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.439646 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/672858ed-71c0-4480-881d-f921a89639d3-crio-socket\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.439646 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/672858ed-71c0-4480-881d-f921a89639d3-crio-socket\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.439646 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-service-ca\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.439994 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-oauth-serving-cert\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.439994 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/672858ed-71c0-4480-881d-f921a89639d3-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.439994 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4fb852c-ad08-434d-abea-ab07a5423921-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7htkt\" (UID: \"a4fb852c-ad08-434d-abea-ab07a5423921\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" Apr 22 15:18:05.439994 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/672858ed-71c0-4480-881d-f921a89639d3-data-volume\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.439994 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a4fb852c-ad08-434d-abea-ab07a5423921-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7htkt\" (UID: \"a4fb852c-ad08-434d-abea-ab07a5423921\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" Apr 22 15:18:05.439994 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.439884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-serving-cert\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.440216 ip-10-0-137-228 
kubenswrapper[2577]: I0422 15:18:05.439996 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/672858ed-71c0-4480-881d-f921a89639d3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.440216 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.440136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/672858ed-71c0-4480-881d-f921a89639d3-data-volume\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.442035 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.441935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/672858ed-71c0-4480-881d-f921a89639d3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.449656 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.449603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nxcp\" (UniqueName: \"kubernetes.io/projected/672858ed-71c0-4480-881d-f921a89639d3-kube-api-access-4nxcp\") pod \"insights-runtime-extractor-2l9ld\" (UID: \"672858ed-71c0-4480-881d-f921a89639d3\") " pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.458030 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.458010 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:18:05.510770 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.510740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2l9ld" Apr 22 15:18:05.540948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.540922 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-trusted-ca\") pod \"aefab03a-4e84-4f7e-9778-d0da04255bfd\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " Apr 22 15:18:05.541083 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.540970 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-certificates\") pod \"aefab03a-4e84-4f7e-9778-d0da04255bfd\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " Apr 22 15:18:05.541083 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541002 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") pod \"aefab03a-4e84-4f7e-9778-d0da04255bfd\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " Apr 22 15:18:05.541083 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541020 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aefab03a-4e84-4f7e-9778-d0da04255bfd-ca-trust-extracted\") pod \"aefab03a-4e84-4f7e-9778-d0da04255bfd\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " Apr 22 15:18:05.541083 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541050 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-installation-pull-secrets\") pod \"aefab03a-4e84-4f7e-9778-d0da04255bfd\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " Apr 22 15:18:05.541083 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541065 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-bound-sa-token\") pod \"aefab03a-4e84-4f7e-9778-d0da04255bfd\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " Apr 22 15:18:05.541269 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541090 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-image-registry-private-configuration\") pod \"aefab03a-4e84-4f7e-9778-d0da04255bfd\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " Apr 22 15:18:05.541269 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541108 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fh5r\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-kube-api-access-4fh5r\") pod \"aefab03a-4e84-4f7e-9778-d0da04255bfd\" (UID: \"aefab03a-4e84-4f7e-9778-d0da04255bfd\") " Apr 22 15:18:05.541269 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-service-ca\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.541269 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541204 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-oauth-serving-cert\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.541269 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541226 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4fb852c-ad08-434d-abea-ab07a5423921-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7htkt\" (UID: \"a4fb852c-ad08-434d-abea-ab07a5423921\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" Apr 22 15:18:05.541269 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a4fb852c-ad08-434d-abea-ab07a5423921-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7htkt\" (UID: \"a4fb852c-ad08-434d-abea-ab07a5423921\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" Apr 22 15:18:05.541269 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-serving-cert\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.541472 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-oauth-config\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.541472 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-trusted-ca-bundle\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.541472 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7frt\" (UniqueName: \"kubernetes.io/projected/7e2980bd-3504-4803-9825-ab03e37698f6-kube-api-access-z7frt\") pod \"downloads-6bcc868b7-7knnb\" (UID: \"7e2980bd-3504-4803-9825-ab03e37698f6\") " pod="openshift-console/downloads-6bcc868b7-7knnb" Apr 22 15:18:05.541472 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-console-config\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.541472 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.541358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psxf9\" (UniqueName: \"kubernetes.io/projected/19c9b964-41f6-4e08-a813-5fe686840c29-kube-api-access-psxf9\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.543591 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.542460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-console-config\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " 
pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.543591 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.542752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-trusted-ca-bundle\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.543591 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.542862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a4fb852c-ad08-434d-abea-ab07a5423921-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-7htkt\" (UID: \"a4fb852c-ad08-434d-abea-ab07a5423921\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" Apr 22 15:18:05.543591 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.543008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-service-ca\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.543591 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.543545 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "aefab03a-4e84-4f7e-9778-d0da04255bfd" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:18:05.543947 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.543683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-oauth-serving-cert\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.543947 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.543683 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aefab03a-4e84-4f7e-9778-d0da04255bfd" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:18:05.544420 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.544393 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "aefab03a-4e84-4f7e-9778-d0da04255bfd" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:18:05.545773 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.545748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-oauth-config\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.545859 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.545840 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "aefab03a-4e84-4f7e-9778-d0da04255bfd" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:18:05.546429 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.546390 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "aefab03a-4e84-4f7e-9778-d0da04255bfd" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:18:05.546537 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.546514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a4fb852c-ad08-434d-abea-ab07a5423921-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-7htkt\" (UID: \"a4fb852c-ad08-434d-abea-ab07a5423921\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" Apr 22 15:18:05.546594 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.546530 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "aefab03a-4e84-4f7e-9778-d0da04255bfd" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:18:05.546773 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.546748 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-serving-cert\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.547484 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.547464 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-kube-api-access-4fh5r" (OuterVolumeSpecName: "kube-api-access-4fh5r") pod "aefab03a-4e84-4f7e-9778-d0da04255bfd" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd"). InnerVolumeSpecName "kube-api-access-4fh5r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:18:05.552433 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.552411 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aefab03a-4e84-4f7e-9778-d0da04255bfd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "aefab03a-4e84-4f7e-9778-d0da04255bfd" (UID: "aefab03a-4e84-4f7e-9778-d0da04255bfd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:18:05.564483 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.564435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psxf9\" (UniqueName: \"kubernetes.io/projected/19c9b964-41f6-4e08-a813-5fe686840c29-kube-api-access-psxf9\") pod \"console-758f9c8856-gpqgw\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.566707 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.566688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7frt\" (UniqueName: \"kubernetes.io/projected/7e2980bd-3504-4803-9825-ab03e37698f6-kube-api-access-z7frt\") pod \"downloads-6bcc868b7-7knnb\" (UID: \"7e2980bd-3504-4803-9825-ab03e37698f6\") " pod="openshift-console/downloads-6bcc868b7-7knnb" Apr 22 15:18:05.620990 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.620961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7knnb" Apr 22 15:18:05.639744 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.639717 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:05.642702 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.642678 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-installation-pull-secrets\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:05.642829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.642706 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-bound-sa-token\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:05.642829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.642723 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aefab03a-4e84-4f7e-9778-d0da04255bfd-image-registry-private-configuration\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:05.642829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.642737 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4fh5r\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-kube-api-access-4fh5r\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:05.642829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.642751 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-trusted-ca\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:05.642829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.642764 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-certificates\") on node 
\"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:05.642829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.642777 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aefab03a-4e84-4f7e-9778-d0da04255bfd-registry-tls\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:05.642829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.642791 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aefab03a-4e84-4f7e-9778-d0da04255bfd-ca-trust-extracted\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:05.651294 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.651274 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" Apr 22 15:18:05.659070 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.659040 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2l9ld"] Apr 22 15:18:05.769823 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.769757 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7knnb"] Apr 22 15:18:05.775342 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:05.775308 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e2980bd_3504_4803_9825_ab03e37698f6.slice/crio-1b970fe4e10ba19bd7cf2838a58307ea8579cc6d5735a5a62a8849256cb1abd6 WatchSource:0}: Error finding container 1b970fe4e10ba19bd7cf2838a58307ea8579cc6d5735a5a62a8849256cb1abd6: Status 404 returned error can't find the container with id 1b970fe4e10ba19bd7cf2838a58307ea8579cc6d5735a5a62a8849256cb1abd6 Apr 22 15:18:05.793729 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.793702 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-758f9c8856-gpqgw"] Apr 22 15:18:05.796297 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:05.796271 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19c9b964_41f6_4e08_a813_5fe686840c29.slice/crio-2aee99bb6184b12cde1d6f96e1e527eb4f866de825d447b6e3b5cd9a93ad08e0 WatchSource:0}: Error finding container 2aee99bb6184b12cde1d6f96e1e527eb4f866de825d447b6e3b5cd9a93ad08e0: Status 404 returned error can't find the container with id 2aee99bb6184b12cde1d6f96e1e527eb4f866de825d447b6e3b5cd9a93ad08e0 Apr 22 15:18:05.808599 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:05.808553 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-7htkt"] Apr 22 15:18:05.811566 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:05.811462 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4fb852c_ad08_434d_abea_ab07a5423921.slice/crio-264b83f18cfa74d9b405feba7b38055fa66d3e0f30c311b47c6da671d2146700 WatchSource:0}: Error finding container 264b83f18cfa74d9b405feba7b38055fa66d3e0f30c311b47c6da671d2146700: Status 404 returned error can't find the container with id 264b83f18cfa74d9b405feba7b38055fa66d3e0f30c311b47c6da671d2146700 Apr 22 15:18:06.241897 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.241856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" event={"ID":"a4fb852c-ad08-434d-abea-ab07a5423921","Type":"ContainerStarted","Data":"264b83f18cfa74d9b405feba7b38055fa66d3e0f30c311b47c6da671d2146700"} Apr 22 15:18:06.243042 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.243016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758f9c8856-gpqgw" 
event={"ID":"19c9b964-41f6-4e08-a813-5fe686840c29","Type":"ContainerStarted","Data":"2aee99bb6184b12cde1d6f96e1e527eb4f866de825d447b6e3b5cd9a93ad08e0"} Apr 22 15:18:06.244510 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.244486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2l9ld" event={"ID":"672858ed-71c0-4480-881d-f921a89639d3","Type":"ContainerStarted","Data":"dc1f426901f9ee618b7ba391af0f56dd2615a1680553b16d8d2450c167978f67"} Apr 22 15:18:06.244606 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.244517 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2l9ld" event={"ID":"672858ed-71c0-4480-881d-f921a89639d3","Type":"ContainerStarted","Data":"64a1ff184d5e7a294cacd1f500cb9b4e459dce218dfc961eaf4f86dca5d6cecf"} Apr 22 15:18:06.245587 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.245558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7knnb" event={"ID":"7e2980bd-3504-4803-9825-ab03e37698f6","Type":"ContainerStarted","Data":"1b970fe4e10ba19bd7cf2838a58307ea8579cc6d5735a5a62a8849256cb1abd6"} Apr 22 15:18:06.246636 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.246611 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" event={"ID":"aefab03a-4e84-4f7e-9778-d0da04255bfd","Type":"ContainerDied","Data":"29caac25b73893a207a9c9332bcdd86a61a9b5e7dfb17247137acec2df5a69fb"} Apr 22 15:18:06.246747 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.246649 2577 scope.go:117] "RemoveContainer" containerID="070cfd14470b0504709ab9d5f76493b6792624cdad03b589dd4f2c4987432b6d" Apr 22 15:18:06.246747 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.246738 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6fd4d896fc-ltlnc" Apr 22 15:18:06.262900 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.262875 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6fd4d896fc-ltlnc"] Apr 22 15:18:06.269025 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:06.269004 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6fd4d896fc-ltlnc"] Apr 22 15:18:07.253976 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:07.253935 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" event={"ID":"a4fb852c-ad08-434d-abea-ab07a5423921","Type":"ContainerStarted","Data":"7f2648ebda38c48ae5aa67d7985c77622cfbd64b83e6e167ee592562475ac186"} Apr 22 15:18:07.256403 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:07.256374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2l9ld" event={"ID":"672858ed-71c0-4480-881d-f921a89639d3","Type":"ContainerStarted","Data":"5fabea13ac737bfa3ee9c289c62dce91c4729b735cfb9ac441218a426b34937e"} Apr 22 15:18:07.272134 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:07.272086 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-7htkt" podStartSLOduration=1.015575302 podStartE2EDuration="2.272066849s" podCreationTimestamp="2026-04-22 15:18:05 +0000 UTC" firstStartedPulling="2026-04-22 15:18:05.813254443 +0000 UTC m=+568.599891481" lastFinishedPulling="2026-04-22 15:18:07.069745984 +0000 UTC m=+569.856383028" observedRunningTime="2026-04-22 15:18:07.268994479 +0000 UTC m=+570.055631541" watchObservedRunningTime="2026-04-22 15:18:07.272066849 +0000 UTC m=+570.058703909" Apr 22 15:18:07.805533 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:07.805498 2577 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" path="/var/lib/kubelet/pods/aefab03a-4e84-4f7e-9778-d0da04255bfd/volumes" Apr 22 15:18:08.062056 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.061968 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"] Apr 22 15:18:08.062321 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062263 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062321 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062282 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062321 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062300 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062321 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062308 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062321 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062328 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062336 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062348 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062355 2577 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062363 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062370 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062378 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062385 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062442 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062453 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062462 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062473 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062482 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.062648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.062617 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aefab03a-4e84-4f7e-9778-d0da04255bfd" containerName="registry" Apr 22 15:18:08.065768 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.065749 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p" Apr 22 15:18:08.068682 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.068654 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 15:18:08.068781 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.068681 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 15:18:08.068781 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.068656 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-px7bc\"" Apr 22 15:18:08.069006 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.068973 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 15:18:08.078726 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.078692 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"] Apr 22 15:18:08.164311 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.164269 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b31439e-0cf1-40d4-844b-59c2435526a4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.164481 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.164320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5r9r\" (UniqueName: \"kubernetes.io/projected/3b31439e-0cf1-40d4-844b-59c2435526a4-kube-api-access-l5r9r\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.164481 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.164402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b31439e-0cf1-40d4-844b-59c2435526a4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.164586 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.164487 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b31439e-0cf1-40d4-844b-59c2435526a4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.265244 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.265210 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b31439e-0cf1-40d4-844b-59c2435526a4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.265687 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.265278 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b31439e-0cf1-40d4-844b-59c2435526a4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.265687 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.265304 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5r9r\" (UniqueName: \"kubernetes.io/projected/3b31439e-0cf1-40d4-844b-59c2435526a4-kube-api-access-l5r9r\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.265687 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.265341 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b31439e-0cf1-40d4-844b-59c2435526a4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.266024 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.265965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b31439e-0cf1-40d4-844b-59c2435526a4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.268685 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.268656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b31439e-0cf1-40d4-844b-59c2435526a4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.268920 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.268900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b31439e-0cf1-40d4-844b-59c2435526a4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.276377 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.275762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5r9r\" (UniqueName: \"kubernetes.io/projected/3b31439e-0cf1-40d4-844b-59c2435526a4-kube-api-access-l5r9r\") pod \"prometheus-operator-5676c8c784-qmc6p\" (UID: \"3b31439e-0cf1-40d4-844b-59c2435526a4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:08.384341 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:08.384267 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"
Apr 22 15:18:09.620621 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:09.620588 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-qmc6p"]
Apr 22 15:18:09.624478 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:09.624443 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b31439e_0cf1_40d4_844b_59c2435526a4.slice/crio-0943f5377a39e687b91caeb6dc30b0b3fed3300314e428c35795111f34adef27 WatchSource:0}: Error finding container 0943f5377a39e687b91caeb6dc30b0b3fed3300314e428c35795111f34adef27: Status 404 returned error can't find the container with id 0943f5377a39e687b91caeb6dc30b0b3fed3300314e428c35795111f34adef27
Apr 22 15:18:10.266749 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:10.266714 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758f9c8856-gpqgw" event={"ID":"19c9b964-41f6-4e08-a813-5fe686840c29","Type":"ContainerStarted","Data":"fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607"}
Apr 22 15:18:10.268743 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:10.268711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2l9ld" event={"ID":"672858ed-71c0-4480-881d-f921a89639d3","Type":"ContainerStarted","Data":"5a49bfd235fda616f2cbedb13669f42a1fa341e6061efe1e785ecd057dfb2503"}
Apr 22 15:18:10.269893 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:10.269867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p" event={"ID":"3b31439e-0cf1-40d4-844b-59c2435526a4","Type":"ContainerStarted","Data":"0943f5377a39e687b91caeb6dc30b0b3fed3300314e428c35795111f34adef27"}
Apr 22 15:18:10.288806 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:10.288743 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-758f9c8856-gpqgw" podStartSLOduration=1.593811077 podStartE2EDuration="5.288726212s" podCreationTimestamp="2026-04-22 15:18:05 +0000 UTC" firstStartedPulling="2026-04-22 15:18:05.79801726 +0000 UTC m=+568.584654297" lastFinishedPulling="2026-04-22 15:18:09.49293238 +0000 UTC m=+572.279569432" observedRunningTime="2026-04-22 15:18:10.287855951 +0000 UTC m=+573.074493013" watchObservedRunningTime="2026-04-22 15:18:10.288726212 +0000 UTC m=+573.075363276"
Apr 22 15:18:10.308781 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:10.308717 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2l9ld" podStartSLOduration=1.567452165 podStartE2EDuration="5.308698703s" podCreationTimestamp="2026-04-22 15:18:05 +0000 UTC" firstStartedPulling="2026-04-22 15:18:05.744491884 +0000 UTC m=+568.531128935" lastFinishedPulling="2026-04-22 15:18:09.48573843 +0000 UTC m=+572.272375473" observedRunningTime="2026-04-22 15:18:10.306327768 +0000 UTC m=+573.092964832" watchObservedRunningTime="2026-04-22 15:18:10.308698703 +0000 UTC m=+573.095335765"
Apr 22 15:18:11.275459 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:11.275421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p" event={"ID":"3b31439e-0cf1-40d4-844b-59c2435526a4","Type":"ContainerStarted","Data":"60379a30eb7ebbf9a6397521ba9c2ef936471caeb00e1309d5407cb2261e9fee"}
Apr 22 15:18:11.275904 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:11.275468 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p" event={"ID":"3b31439e-0cf1-40d4-844b-59c2435526a4","Type":"ContainerStarted","Data":"83aa42fb572f1ee135773bd7fd4bd412d470cc68f2fbf3a7cdb09a4450df9913"}
Apr 22 15:18:11.292745 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:11.292686 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-qmc6p" podStartSLOduration=2.111806291 podStartE2EDuration="3.292669901s" podCreationTimestamp="2026-04-22 15:18:08 +0000 UTC" firstStartedPulling="2026-04-22 15:18:09.627009464 +0000 UTC m=+572.413646501" lastFinishedPulling="2026-04-22 15:18:10.807873065 +0000 UTC m=+573.594510111" observedRunningTime="2026-04-22 15:18:11.292105864 +0000 UTC m=+574.078742926" watchObservedRunningTime="2026-04-22 15:18:11.292669901 +0000 UTC m=+574.079306962"
Apr 22 15:18:13.420806 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.420767 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"]
Apr 22 15:18:13.424550 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.424527 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-75wkc"]
Apr 22 15:18:13.424743 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.424713 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.427029 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.427008 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 22 15:18:13.427695 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.427670 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-7kg4p\""
Apr 22 15:18:13.427780 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.427670 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 22 15:18:13.427780 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.427770 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.431826 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.431600 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 15:18:13.431826 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.431601 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-8dvjq\""
Apr 22 15:18:13.431826 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.431656 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 15:18:13.431826 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.431605 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 15:18:13.435976 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.435955 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"]
Apr 22 15:18:13.510854 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.510822 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-textfile\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.510854 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.510860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-root\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.511066 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.510881 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.511066 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.510986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8j5n\" (UniqueName: \"kubernetes.io/projected/535d8fcd-432a-4fa0-a4d0-4c0c54323776-kube-api-access-g8j5n\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.511066 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/572626ce-7b81-451b-b464-a73e55d35d02-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.511181 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/535d8fcd-432a-4fa0-a4d0-4c0c54323776-metrics-client-ca\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.511181 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-sys\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.511289 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511213 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-accelerators-collector-config\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.511289 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv4n6\" (UniqueName: \"kubernetes.io/projected/572626ce-7b81-451b-b464-a73e55d35d02-kube-api-access-wv4n6\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.511368 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-wtmp\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.511368 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511314 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-tls\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.511467 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511378 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.511467 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.511402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.612273 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/535d8fcd-432a-4fa0-a4d0-4c0c54323776-metrics-client-ca\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612273 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612274 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-sys\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-accelerators-collector-config\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wv4n6\" (UniqueName: \"kubernetes.io/projected/572626ce-7b81-451b-b464-a73e55d35d02-kube-api-access-wv4n6\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.612452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-wtmp\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-tls\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.612452 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612448 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-textfile\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612854 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612474 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-root\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612854 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.612854 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8j5n\" (UniqueName: \"kubernetes.io/projected/535d8fcd-432a-4fa0-a4d0-4c0c54323776-kube-api-access-g8j5n\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612854 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-wtmp\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612854 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/572626ce-7b81-451b-b464-a73e55d35d02-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.612854 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612446 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-sys\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.612854 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:18:13.612691 2577 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 22 15:18:13.612854 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:18:13.612748 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-tls podName:572626ce-7b81-451b-b464-a73e55d35d02 nodeName:}" failed. No retries permitted until 2026-04-22 15:18:14.112728485 +0000 UTC m=+576.899365527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-l7k5k" (UID: "572626ce-7b81-451b-b464-a73e55d35d02") : secret "openshift-state-metrics-tls" not found
Apr 22 15:18:13.613282 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-textfile\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.613282 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/535d8fcd-432a-4fa0-a4d0-4c0c54323776-root\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.613282 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.612978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-accelerators-collector-config\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.613282 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.613134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/535d8fcd-432a-4fa0-a4d0-4c0c54323776-metrics-client-ca\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.613640 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.613616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/572626ce-7b81-451b-b464-a73e55d35d02-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.615103 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.615077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-tls\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.615218 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.615089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/535d8fcd-432a-4fa0-a4d0-4c0c54323776-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.615218 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.615143 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.621105 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.621077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv4n6\" (UniqueName: \"kubernetes.io/projected/572626ce-7b81-451b-b464-a73e55d35d02-kube-api-access-wv4n6\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:13.621895 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.621872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8j5n\" (UniqueName: \"kubernetes.io/projected/535d8fcd-432a-4fa0-a4d0-4c0c54323776-kube-api-access-g8j5n\") pod \"node-exporter-75wkc\" (UID: \"535d8fcd-432a-4fa0-a4d0-4c0c54323776\") " pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.745426 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:13.745330 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-75wkc"
Apr 22 15:18:13.757077 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:13.757034 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535d8fcd_432a_4fa0_a4d0_4c0c54323776.slice/crio-9e29386a79a98b56ae228f978633d3e83583d52e08fe58c7409b5d1ababcc353 WatchSource:0}: Error finding container 9e29386a79a98b56ae228f978633d3e83583d52e08fe58c7409b5d1ababcc353: Status 404 returned error can't find the container with id 9e29386a79a98b56ae228f978633d3e83583d52e08fe58c7409b5d1ababcc353
Apr 22 15:18:14.117642 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.117595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:14.121071 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.121010 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/572626ce-7b81-451b-b464-a73e55d35d02-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-l7k5k\" (UID: \"572626ce-7b81-451b-b464-a73e55d35d02\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:14.287711 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.287679 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-75wkc" event={"ID":"535d8fcd-432a-4fa0-a4d0-4c0c54323776","Type":"ContainerStarted","Data":"9e29386a79a98b56ae228f978633d3e83583d52e08fe58c7409b5d1ababcc353"}
Apr 22 15:18:14.338940 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.338903 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"
Apr 22 15:18:14.476397 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.476338 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:18:14.480707 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.480685 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.486723 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.486077 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 15:18:14.486723 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.486150 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 15:18:14.486723 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.486364 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 15:18:14.486723 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.486382 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 15:18:14.486723 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.486453 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 15:18:14.486723 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.486523 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 15:18:14.486723 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.486622 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-4flh2\""
Apr 22 15:18:14.488586 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.488051 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 15:18:14.488924 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.488900 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 15:18:14.489430 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.489109 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 15:18:14.498576 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.498554 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 15:18:14.586653 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.586610 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k"]
Apr 22 15:18:14.590653 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:14.590354 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572626ce_7b81_451b_b464_a73e55d35d02.slice/crio-511f966ebbb60c96a31832ff941739c0ea2bb0f975b9f5deb16b04ca0c9a66f1 WatchSource:0}: Error finding container 511f966ebbb60c96a31832ff941739c0ea2bb0f975b9f5deb16b04ca0c9a66f1: Status 404 returned error can't find the container with id 511f966ebbb60c96a31832ff941739c0ea2bb0f975b9f5deb16b04ca0c9a66f1
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-config-out\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-web-config\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 15:18:14.622791 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName:
\"kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.623310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.623310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622873 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws8xv\" (UniqueName: \"kubernetes.io/projected/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-kube-api-access-ws8xv\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.623310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.622906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-config-volume\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724102 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-config-volume\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724102 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724076 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724314 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724314 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724143 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724314 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724174 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724314 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 22 15:18:14.724314 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724314 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724278 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724314 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:18:14.724304 2577 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 22 15:18:14.724650 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724320 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-config-out\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724650 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-web-config\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724650 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:18:14.724375 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-main-tls podName:8ef4aadb-7584-494f-b7ec-96fed3eaee8d nodeName:}" failed. No retries permitted until 2026-04-22 15:18:15.224353404 +0000 UTC m=+578.010990466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "8ef4aadb-7584-494f-b7ec-96fed3eaee8d") : secret "alertmanager-main-tls" not found Apr 22 15:18:14.724650 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724650 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724486 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.724650 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.724513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws8xv\" (UniqueName: \"kubernetes.io/projected/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-kube-api-access-ws8xv\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.730246 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.730221 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.730246 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.730237 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-config-out\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.730490 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.730465 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.730595 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.730574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.730928 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:18:14.730693 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-trusted-ca-bundle podName:8ef4aadb-7584-494f-b7ec-96fed3eaee8d nodeName:}" failed. No retries permitted until 2026-04-22 15:18:15.230675117 +0000 UTC m=+578.017312163 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8ef4aadb-7584-494f-b7ec-96fed3eaee8d") : configmap references non-existent config key: ca-bundle.crt Apr 22 15:18:14.731048 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.731021 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.731322 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.731277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.731858 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.731819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.732170 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.732125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
15:18:14.733093 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.733035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-config-volume\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.735054 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.734945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-web-config\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:14.736709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:14.736688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws8xv\" (UniqueName: \"kubernetes.io/projected/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-kube-api-access-ws8xv\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:15.229796 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.229752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:15.232963 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.232934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:15.292944 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.292896 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k" event={"ID":"572626ce-7b81-451b-b464-a73e55d35d02","Type":"ContainerStarted","Data":"33e39ba4ed5a5747cbcfb43d1be4b370f1d2f029f32e5bcb0c2227a6bd345484"} Apr 22 15:18:15.293110 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.292952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k" event={"ID":"572626ce-7b81-451b-b464-a73e55d35d02","Type":"ContainerStarted","Data":"9903d4d5506198e82e948e121dd944e11e220bd1c70ffea4ccb94f4bab9982ff"} Apr 22 15:18:15.293110 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.292968 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k" event={"ID":"572626ce-7b81-451b-b464-a73e55d35d02","Type":"ContainerStarted","Data":"511f966ebbb60c96a31832ff941739c0ea2bb0f975b9f5deb16b04ca0c9a66f1"} Apr 22 15:18:15.294344 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.294312 2577 generic.go:358] "Generic (PLEG): container finished" podID="535d8fcd-432a-4fa0-a4d0-4c0c54323776" containerID="165cd0d58addfeab1e22b3aff3b42c4258bd9b7658966adf4e4ee8d3076c901b" exitCode=0 Apr 22 15:18:15.294471 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.294402 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-75wkc" event={"ID":"535d8fcd-432a-4fa0-a4d0-4c0c54323776","Type":"ContainerDied","Data":"165cd0d58addfeab1e22b3aff3b42c4258bd9b7658966adf4e4ee8d3076c901b"} Apr 22 15:18:15.330420 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.330390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:15.331099 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.331080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ef4aadb-7584-494f-b7ec-96fed3eaee8d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8ef4aadb-7584-494f-b7ec-96fed3eaee8d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:15.400372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.400343 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 15:18:15.640458 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.640331 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:15.640458 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.640399 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:15.647095 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.647072 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:15.789225 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:15.789182 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 15:18:15.790709 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:15.790683 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ef4aadb_7584_494f_b7ec_96fed3eaee8d.slice/crio-f20cc781c9339a27a246d81ce682d9c06ed6ae200cdf46ffca5ccc261401d8de 
WatchSource:0}: Error finding container f20cc781c9339a27a246d81ce682d9c06ed6ae200cdf46ffca5ccc261401d8de: Status 404 returned error can't find the container with id f20cc781c9339a27a246d81ce682d9c06ed6ae200cdf46ffca5ccc261401d8de Apr 22 15:18:16.300482 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:16.300412 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k" event={"ID":"572626ce-7b81-451b-b464-a73e55d35d02","Type":"ContainerStarted","Data":"6c6d37cd7908f5f37a9cdaef7af2099828461ca6c6c245c57ac34f5972fe2216"} Apr 22 15:18:16.302923 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:16.302871 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-75wkc" event={"ID":"535d8fcd-432a-4fa0-a4d0-4c0c54323776","Type":"ContainerStarted","Data":"9546d2f048ae72588d5091b42eddb60784d8cbf0d891324fdf1dbc751bafeed6"} Apr 22 15:18:16.302923 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:16.302905 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-75wkc" event={"ID":"535d8fcd-432a-4fa0-a4d0-4c0c54323776","Type":"ContainerStarted","Data":"cfe1dca67f70f65fcbdad9d35e578d2dfda4480e0b18294845536c40a6ea9f8d"} Apr 22 15:18:16.304333 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:16.304309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ef4aadb-7584-494f-b7ec-96fed3eaee8d","Type":"ContainerStarted","Data":"f20cc781c9339a27a246d81ce682d9c06ed6ae200cdf46ffca5ccc261401d8de"} Apr 22 15:18:16.310655 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:16.310621 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:16.318906 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:16.318821 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-l7k5k" podStartSLOduration=2.439453644 podStartE2EDuration="3.318804222s" podCreationTimestamp="2026-04-22 15:18:13 +0000 UTC" firstStartedPulling="2026-04-22 15:18:14.784882547 +0000 UTC m=+577.571519592" lastFinishedPulling="2026-04-22 15:18:15.664233118 +0000 UTC m=+578.450870170" observedRunningTime="2026-04-22 15:18:16.317265538 +0000 UTC m=+579.103902605" watchObservedRunningTime="2026-04-22 15:18:16.318804222 +0000 UTC m=+579.105441283" Apr 22 15:18:16.365782 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:16.365723 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-75wkc" podStartSLOduration=2.634613296 podStartE2EDuration="3.36570332s" podCreationTimestamp="2026-04-22 15:18:13 +0000 UTC" firstStartedPulling="2026-04-22 15:18:13.759125998 +0000 UTC m=+576.545763039" lastFinishedPulling="2026-04-22 15:18:14.490216008 +0000 UTC m=+577.276853063" observedRunningTime="2026-04-22 15:18:16.34298234 +0000 UTC m=+579.129619420" watchObservedRunningTime="2026-04-22 15:18:16.36570332 +0000 UTC m=+579.152340380" Apr 22 15:18:17.308209 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:17.308156 2577 generic.go:358] "Generic (PLEG): container finished" podID="8ef4aadb-7584-494f-b7ec-96fed3eaee8d" containerID="980a19051843d67a4b27624f34a7073dda0948ce37fe1407ab667a26325a1c6f" exitCode=0 Apr 22 15:18:17.308595 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:17.308282 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ef4aadb-7584-494f-b7ec-96fed3eaee8d","Type":"ContainerDied","Data":"980a19051843d67a4b27624f34a7073dda0948ce37fe1407ab667a26325a1c6f"} Apr 22 15:18:18.224012 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.223960 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74748b6745-5hk4w"] Apr 22 15:18:18.229081 ip-10-0-137-228 kubenswrapper[2577]: 
I0422 15:18:18.227666 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr"] Apr 22 15:18:18.232692 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.232664 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" Apr 22 15:18:18.232830 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.232672 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.236628 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.236604 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 15:18:18.236740 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.236691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5cknt\"" Apr 22 15:18:18.237449 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.237425 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr"] Apr 22 15:18:18.242041 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.241943 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74748b6745-5hk4w"] Apr 22 15:18:18.362249 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.362206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-service-ca\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.362684 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.362263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-trusted-ca-bundle\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.362684 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.362286 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-config\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.362684 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.362309 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-serving-cert\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.362684 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.362324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-oauth-serving-cert\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.362684 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.362430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-oauth-config\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 
15:18:18.362684 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.362493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b90d9259-ac80-4f8e-a1d4-34cc591dc37f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r9cgr\" (UID: \"b90d9259-ac80-4f8e-a1d4-34cc591dc37f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" Apr 22 15:18:18.362684 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.362566 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmntp\" (UniqueName: \"kubernetes.io/projected/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-kube-api-access-bmntp\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.464477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.463949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-oauth-config\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.464477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.464056 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b90d9259-ac80-4f8e-a1d4-34cc591dc37f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r9cgr\" (UID: \"b90d9259-ac80-4f8e-a1d4-34cc591dc37f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" Apr 22 15:18:18.464477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.464141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmntp\" (UniqueName: 
\"kubernetes.io/projected/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-kube-api-access-bmntp\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.464477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.464189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-service-ca\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.464477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.464256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-trusted-ca-bundle\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.464477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.464301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-config\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.464477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.464329 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-serving-cert\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.464477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.464367 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-oauth-serving-cert\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.465949 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.465898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-oauth-serving-cert\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.466106 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.466077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-trusted-ca-bundle\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.466277 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.466232 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-service-ca\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.467136 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.466690 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-config\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.467136 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.467110 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-oauth-config\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.467842 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.467822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b90d9259-ac80-4f8e-a1d4-34cc591dc37f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-r9cgr\" (UID: \"b90d9259-ac80-4f8e-a1d4-34cc591dc37f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" Apr 22 15:18:18.467954 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.467935 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-serving-cert\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.473722 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.473700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmntp\" (UniqueName: \"kubernetes.io/projected/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-kube-api-access-bmntp\") pod \"console-74748b6745-5hk4w\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") " pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:18.549627 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.549573 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" Apr 22 15:18:18.555899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:18.555867 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:19.659648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.659236 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:18:19.665734 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.665700 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.668564 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668450 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 15:18:19.668564 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668493 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 15:18:19.668564 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668513 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 15:18:19.668785 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668514 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7f9nk2qgmde3q\"" Apr 22 15:18:19.668785 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668659 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 15:18:19.668785 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668462 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 15:18:19.668785 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668761 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 
15:18:19.668961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668933 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 15:18:19.668961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668950 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-c8p4l\"" Apr 22 15:18:19.669066 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.668989 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 15:18:19.669066 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.669037 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 15:18:19.669168 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.669124 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 15:18:19.669168 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.669134 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 15:18:19.669487 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.669467 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 15:18:19.675038 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.673274 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 15:18:19.675456 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.675422 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:18:19.775940 ip-10-0-137-228 kubenswrapper[2577]: I0422 
15:18:19.775908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.775940 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.775948 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776288 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776288 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776117 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776288 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776288 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776288 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776297 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776322 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776410 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776541 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776840 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776547 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-config\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776840 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776840 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776637 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.776840 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.776679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfhpx\" (UniqueName: \"kubernetes.io/projected/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-kube-api-access-dfhpx\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.877944 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.877908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.877944 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.877944 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.878155 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.877963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.878155 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.877988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.878155 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.878113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.879815 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.879741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.880696 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.878172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.880696 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.880014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.880696 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.880087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-config\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.880696 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.880134 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.880696 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.880176 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.880696 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.880228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfhpx\" (UniqueName: \"kubernetes.io/projected/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-kube-api-access-dfhpx\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.880696 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.880288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.880696 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.880321 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.881114 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.880747 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.883302 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.882471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.883402 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.881781 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.886890 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.886766 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.888881 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.888850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.888969 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.888909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.889034 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.888970 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.889099 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.889023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.889899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.889870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.890730 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.890708 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-config\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.890866 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.890778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.891667 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.891640 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.892002 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.891984 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.892557 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.892537 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.892788 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.892763 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.894549 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.894522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.894921 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.894889 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.895297 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.895235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.895845 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.895794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.896418 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.896361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.898201 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.898172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.899908 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.899887 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfhpx\" (UniqueName: \"kubernetes.io/projected/fa60e528-1e5f-41c3-a1bf-51d4e6f07f88-kube-api-access-dfhpx\") pod \"prometheus-k8s-0\" (UID: \"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:19.979803 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:19.979704 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:25.557216 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:25.557153 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:18:25.558247 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:25.558223 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa60e528_1e5f_41c3_a1bf_51d4e6f07f88.slice/crio-20d62e811eea0d77980c016d24660d8745e4d74ff30311c22365edc17fb160ae WatchSource:0}: Error finding container 20d62e811eea0d77980c016d24660d8745e4d74ff30311c22365edc17fb160ae: Status 404 returned error can't find the container with id 20d62e811eea0d77980c016d24660d8745e4d74ff30311c22365edc17fb160ae Apr 22 15:18:25.772785 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:25.772691 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74748b6745-5hk4w"] Apr 22 15:18:25.777573 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:25.777539 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d4dee4_992e_45c1_b58a_1a40a83b7c8c.slice/crio-60090e468330cf46f66e42f4fb5bcc5b8d03784c9dcf66440e6ed1b4a29f72c1 WatchSource:0}: Error finding container 60090e468330cf46f66e42f4fb5bcc5b8d03784c9dcf66440e6ed1b4a29f72c1: Status 404 returned error can't find the container with id 60090e468330cf46f66e42f4fb5bcc5b8d03784c9dcf66440e6ed1b4a29f72c1 Apr 22 15:18:25.778708 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:25.778660 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr"] Apr 22 15:18:25.780639 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:18:25.780613 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90d9259_ac80_4f8e_a1d4_34cc591dc37f.slice/crio-b8bb83fff64558361e8cec073840c1a0aaa7af38c718a1db48e75955de715ce9 WatchSource:0}: Error finding container b8bb83fff64558361e8cec073840c1a0aaa7af38c718a1db48e75955de715ce9: Status 404 returned error can't find the container with id b8bb83fff64558361e8cec073840c1a0aaa7af38c718a1db48e75955de715ce9 Apr 22 15:18:26.345213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.345083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ef4aadb-7584-494f-b7ec-96fed3eaee8d","Type":"ContainerStarted","Data":"c454163939af319c4c0ce488077116a5c5e44ade4251f7d124e0ec3e110555ec"} Apr 22 15:18:26.345213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.345128 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ef4aadb-7584-494f-b7ec-96fed3eaee8d","Type":"ContainerStarted","Data":"8aa61a987c9d0e00029b9afa87fc9a9f269a1536fbe489d206feea5f9541117a"} Apr 22 15:18:26.345213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.345142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ef4aadb-7584-494f-b7ec-96fed3eaee8d","Type":"ContainerStarted","Data":"0706e108dbb3664d3dc3830650b4265a3799922bb9b805f4322b2d7c727c0b4f"} Apr 22 15:18:26.345213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.345154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ef4aadb-7584-494f-b7ec-96fed3eaee8d","Type":"ContainerStarted","Data":"e70935934af5cf1b8384d33c6ebe2a28e24fc629f7b5fc925915e7e996f7a52c"} Apr 22 15:18:26.345213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.345165 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ef4aadb-7584-494f-b7ec-96fed3eaee8d","Type":"ContainerStarted","Data":"2158f010477486751b377b8b321ba8df777c1cb314fd811ff8720743753023c1"} Apr 22 15:18:26.347726 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.347663 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74748b6745-5hk4w" event={"ID":"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c","Type":"ContainerStarted","Data":"209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34"} Apr 22 15:18:26.347726 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.347698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74748b6745-5hk4w" event={"ID":"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c","Type":"ContainerStarted","Data":"60090e468330cf46f66e42f4fb5bcc5b8d03784c9dcf66440e6ed1b4a29f72c1"} Apr 22 15:18:26.349846 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.349803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7knnb" event={"ID":"7e2980bd-3504-4803-9825-ab03e37698f6","Type":"ContainerStarted","Data":"ccd40717cd8b9342f71f1411095072509396f628619649918ad4a2aa8739dd60"} Apr 22 15:18:26.350375 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.350314 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-7knnb" Apr 22 15:18:26.353308 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.353280 2577 generic.go:358] "Generic (PLEG): container finished" podID="fa60e528-1e5f-41c3-a1bf-51d4e6f07f88" containerID="592414b19b85dc5d02806427f26974019efdc83c7806a4cddf4a079de87ce0aa" exitCode=0 Apr 22 15:18:26.353428 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.353354 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88","Type":"ContainerDied","Data":"592414b19b85dc5d02806427f26974019efdc83c7806a4cddf4a079de87ce0aa"} Apr 22 15:18:26.353428 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.353377 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88","Type":"ContainerStarted","Data":"20d62e811eea0d77980c016d24660d8745e4d74ff30311c22365edc17fb160ae"} Apr 22 15:18:26.354976 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.354920 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" event={"ID":"b90d9259-ac80-4f8e-a1d4-34cc591dc37f","Type":"ContainerStarted","Data":"b8bb83fff64558361e8cec073840c1a0aaa7af38c718a1db48e75955de715ce9"} Apr 22 15:18:26.361862 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.361815 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-7knnb" Apr 22 15:18:26.366952 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.366735 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74748b6745-5hk4w" podStartSLOduration=8.36672109 podStartE2EDuration="8.36672109s" podCreationTimestamp="2026-04-22 15:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:18:26.36593175 +0000 UTC m=+589.152568811" watchObservedRunningTime="2026-04-22 15:18:26.36672109 +0000 UTC m=+589.153358151" Apr 22 15:18:26.383663 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:26.383439 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-7knnb" podStartSLOduration=1.7376320779999999 podStartE2EDuration="21.383422911s" podCreationTimestamp="2026-04-22 15:18:05 +0000 UTC" firstStartedPulling="2026-04-22 15:18:05.777431427 +0000 UTC m=+568.564068466" lastFinishedPulling="2026-04-22 15:18:25.423222253 +0000 UTC m=+588.209859299" observedRunningTime="2026-04-22 15:18:26.381664294 +0000 UTC m=+589.168301355" watchObservedRunningTime="2026-04-22 15:18:26.383422911 +0000 UTC m=+589.170059974" Apr 22 15:18:27.365743 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:27.365605 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8ef4aadb-7584-494f-b7ec-96fed3eaee8d","Type":"ContainerStarted","Data":"ecb466b6b616e2e35708e70670b5f974f62fe79ecd895daea3f1f90349e72cba"} Apr 22 15:18:27.396484 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:27.396430 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.566146921 podStartE2EDuration="13.396414618s" podCreationTimestamp="2026-04-22 15:18:14 +0000 UTC" firstStartedPulling="2026-04-22 15:18:15.793134383 +0000 UTC m=+578.579771426" lastFinishedPulling="2026-04-22 15:18:26.623402083 +0000 UTC m=+589.410039123" observedRunningTime="2026-04-22 15:18:27.393766802 +0000 UTC m=+590.180403865" watchObservedRunningTime="2026-04-22 15:18:27.396414618 +0000 UTC m=+590.183051680" Apr 22 15:18:28.370584 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:28.370539 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" event={"ID":"b90d9259-ac80-4f8e-a1d4-34cc591dc37f","Type":"ContainerStarted","Data":"21f316223c96bb5f2d601516e8a1ba84133c7219e6c3434382797a81c1542d6d"} Apr 22 15:18:28.371639 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:28.371601 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" Apr 22 15:18:28.378313 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:28.378288 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" Apr 22 15:18:28.410944 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:28.410888 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-r9cgr" podStartSLOduration=8.19698512 podStartE2EDuration="10.410870405s" podCreationTimestamp="2026-04-22 15:18:18 +0000 UTC" firstStartedPulling="2026-04-22 15:18:25.782764157 +0000 UTC m=+588.569401209" lastFinishedPulling="2026-04-22 15:18:27.996649445 +0000 UTC m=+590.783286494" observedRunningTime="2026-04-22 15:18:28.40863916 +0000 UTC m=+591.195276221" watchObservedRunningTime="2026-04-22 15:18:28.410870405 +0000 UTC m=+591.197507465" Apr 22 15:18:28.556888 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:28.556096 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:28.556888 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:28.556155 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:28.562187 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:28.562160 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:29.377772 ip-10-0-137-228 kubenswrapper[2577]: I0422 
15:18:29.377738 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74748b6745-5hk4w" Apr 22 15:18:29.424739 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:29.424707 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-758f9c8856-gpqgw"] Apr 22 15:18:30.378590 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:30.378508 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88","Type":"ContainerStarted","Data":"d3562d6386abbc1df695382b98c817cda3723d45b8ed58ac2bc1d8e7e39e1a35"} Apr 22 15:18:30.378590 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:30.378552 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88","Type":"ContainerStarted","Data":"fb4b7396737809f38a8af3dd322c364d13849e91653b5250e2ef56cc35421dac"} Apr 22 15:18:33.391458 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:33.391369 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88","Type":"ContainerStarted","Data":"0457ec7a7d3d6bec5215b6ba0b33f286a88cf3165c04f29ac7cee70162912f66"} Apr 22 15:18:33.391458 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:33.391426 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88","Type":"ContainerStarted","Data":"f18bb3a6fdacbbf5feb26c2ed515311a7914c6c5dbd45a14470b5e1fb440a004"} Apr 22 15:18:33.391458 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:33.391442 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88","Type":"ContainerStarted","Data":"70753565f505a91106fb74b21d635e2221e1b54d59727e43c7c920126f56a650"} Apr 22 
15:18:33.391458 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:33.391458 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa60e528-1e5f-41c3-a1bf-51d4e6f07f88","Type":"ContainerStarted","Data":"a676cbb334bd3ae8d5eb82521723c20865234df816c0c6cf8ba669c1f05e3002"} Apr 22 15:18:33.420606 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:33.420558 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=8.274873645 podStartE2EDuration="14.420542663s" podCreationTimestamp="2026-04-22 15:18:19 +0000 UTC" firstStartedPulling="2026-04-22 15:18:26.35503029 +0000 UTC m=+589.141667342" lastFinishedPulling="2026-04-22 15:18:32.500699323 +0000 UTC m=+595.287336360" observedRunningTime="2026-04-22 15:18:33.419514272 +0000 UTC m=+596.206151369" watchObservedRunningTime="2026-04-22 15:18:33.420542663 +0000 UTC m=+596.207179726" Apr 22 15:18:34.980293 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:34.980253 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:18:37.717112 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:37.717088 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log" Apr 22 15:18:37.717477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:37.717187 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log" Apr 22 15:18:37.719897 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:37.719876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log" Apr 22 15:18:37.720041 ip-10-0-137-228 
kubenswrapper[2577]: I0422 15:18:37.719945 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log" Apr 22 15:18:54.448321 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.448258 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-758f9c8856-gpqgw" podUID="19c9b964-41f6-4e08-a813-5fe686840c29" containerName="console" containerID="cri-o://fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607" gracePeriod=15 Apr 22 15:18:54.704244 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.704174 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-758f9c8856-gpqgw_19c9b964-41f6-4e08-a813-5fe686840c29/console/0.log" Apr 22 15:18:54.704352 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.704247 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:54.813961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.813929 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-trusted-ca-bundle\") pod \"19c9b964-41f6-4e08-a813-5fe686840c29\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " Apr 22 15:18:54.813961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.813959 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-oauth-serving-cert\") pod \"19c9b964-41f6-4e08-a813-5fe686840c29\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " Apr 22 15:18:54.814146 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.813994 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-serving-cert\") pod \"19c9b964-41f6-4e08-a813-5fe686840c29\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " Apr 22 15:18:54.814146 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.814014 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-console-config\") pod \"19c9b964-41f6-4e08-a813-5fe686840c29\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " Apr 22 15:18:54.814146 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.814038 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psxf9\" (UniqueName: \"kubernetes.io/projected/19c9b964-41f6-4e08-a813-5fe686840c29-kube-api-access-psxf9\") pod \"19c9b964-41f6-4e08-a813-5fe686840c29\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " Apr 22 15:18:54.814146 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.814108 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-oauth-config\") pod \"19c9b964-41f6-4e08-a813-5fe686840c29\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " Apr 22 15:18:54.814372 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.814152 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-service-ca\") pod \"19c9b964-41f6-4e08-a813-5fe686840c29\" (UID: \"19c9b964-41f6-4e08-a813-5fe686840c29\") " Apr 22 15:18:54.814461 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.814427 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod 
"19c9b964-41f6-4e08-a813-5fe686840c29" (UID: "19c9b964-41f6-4e08-a813-5fe686840c29"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:18:54.814461 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.814441 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-console-config" (OuterVolumeSpecName: "console-config") pod "19c9b964-41f6-4e08-a813-5fe686840c29" (UID: "19c9b964-41f6-4e08-a813-5fe686840c29"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:18:54.814461 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.814451 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "19c9b964-41f6-4e08-a813-5fe686840c29" (UID: "19c9b964-41f6-4e08-a813-5fe686840c29"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:18:54.814770 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.814746 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-service-ca" (OuterVolumeSpecName: "service-ca") pod "19c9b964-41f6-4e08-a813-5fe686840c29" (UID: "19c9b964-41f6-4e08-a813-5fe686840c29"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:18:54.816304 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.816271 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19c9b964-41f6-4e08-a813-5fe686840c29-kube-api-access-psxf9" (OuterVolumeSpecName: "kube-api-access-psxf9") pod "19c9b964-41f6-4e08-a813-5fe686840c29" (UID: "19c9b964-41f6-4e08-a813-5fe686840c29"). InnerVolumeSpecName "kube-api-access-psxf9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:18:54.816390 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.816295 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "19c9b964-41f6-4e08-a813-5fe686840c29" (UID: "19c9b964-41f6-4e08-a813-5fe686840c29"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:18:54.816390 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.816363 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "19c9b964-41f6-4e08-a813-5fe686840c29" (UID: "19c9b964-41f6-4e08-a813-5fe686840c29"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:18:54.915008 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.914977 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-serving-cert\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:54.915008 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.915002 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-console-config\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:54.915008 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.915013 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-psxf9\" (UniqueName: \"kubernetes.io/projected/19c9b964-41f6-4e08-a813-5fe686840c29-kube-api-access-psxf9\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:54.915281 ip-10-0-137-228 
kubenswrapper[2577]: I0422 15:18:54.915022 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19c9b964-41f6-4e08-a813-5fe686840c29-console-oauth-config\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:54.915281 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.915032 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-service-ca\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:54.915281 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.915040 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-trusted-ca-bundle\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:54.915281 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:54.915048 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19c9b964-41f6-4e08-a813-5fe686840c29-oauth-serving-cert\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:18:55.466943 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.466919 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-758f9c8856-gpqgw_19c9b964-41f6-4e08-a813-5fe686840c29/console/0.log" Apr 22 15:18:55.467346 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.466956 2577 generic.go:358] "Generic (PLEG): container finished" podID="19c9b964-41f6-4e08-a813-5fe686840c29" containerID="fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607" exitCode=2 Apr 22 15:18:55.467346 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.466987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758f9c8856-gpqgw" 
event={"ID":"19c9b964-41f6-4e08-a813-5fe686840c29","Type":"ContainerDied","Data":"fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607"} Apr 22 15:18:55.467346 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.467027 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-758f9c8856-gpqgw" event={"ID":"19c9b964-41f6-4e08-a813-5fe686840c29","Type":"ContainerDied","Data":"2aee99bb6184b12cde1d6f96e1e527eb4f866de825d447b6e3b5cd9a93ad08e0"} Apr 22 15:18:55.467346 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.467027 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-758f9c8856-gpqgw" Apr 22 15:18:55.467346 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.467095 2577 scope.go:117] "RemoveContainer" containerID="fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607" Apr 22 15:18:55.475302 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.475269 2577 scope.go:117] "RemoveContainer" containerID="fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607" Apr 22 15:18:55.475582 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:18:55.475561 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607\": container with ID starting with fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607 not found: ID does not exist" containerID="fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607" Apr 22 15:18:55.475693 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.475592 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607"} err="failed to get container status \"fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607\": rpc error: code = NotFound desc = could not find container 
\"fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607\": container with ID starting with fbf41ff6eb9ba0f31445c99138f64d0604663271d414501c2c14223e2d142607 not found: ID does not exist" Apr 22 15:18:55.490580 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.490556 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-758f9c8856-gpqgw"] Apr 22 15:18:55.493527 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.493507 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-758f9c8856-gpqgw"] Apr 22 15:18:55.803530 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:18:55.803501 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19c9b964-41f6-4e08-a813-5fe686840c29" path="/var/lib/kubelet/pods/19c9b964-41f6-4e08-a813-5fe686840c29/volumes" Apr 22 15:19:19.980709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:19.980675 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:19:19.999327 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:19.999300 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:19:20.557444 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:20.557417 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:19:37.768320 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.768286 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5f5f55ddc7-66h44"] Apr 22 15:19:37.768766 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.768662 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19c9b964-41f6-4e08-a813-5fe686840c29" containerName="console" Apr 22 15:19:37.768766 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.768675 2577 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="19c9b964-41f6-4e08-a813-5fe686840c29" containerName="console" Apr 22 15:19:37.768766 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.768760 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="19c9b964-41f6-4e08-a813-5fe686840c29" containerName="console" Apr 22 15:19:37.773561 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.773535 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.777685 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.777656 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 15:19:37.778013 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.777996 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 15:19:37.778554 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.778533 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 15:19:37.778666 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.778565 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 15:19:37.778666 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.778629 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 15:19:37.783268 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.783248 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 15:19:37.783620 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.783606 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-2wb9v\"" Apr 22 15:19:37.792182 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.792160 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5f5f55ddc7-66h44"] Apr 22 15:19:37.856551 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.856521 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-secret-telemeter-client\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.856711 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.856578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-metrics-client-ca\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.856711 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.856610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.856711 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.856640 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf6j8\" (UniqueName: \"kubernetes.io/projected/e93874f8-9ce0-4a04-8e9c-8ae407239e13-kube-api-access-tf6j8\") pod 
\"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.856711 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.856658 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-federate-client-tls\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.856711 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.856705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-telemeter-client-tls\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.856880 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.856734 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.856880 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.856775 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-serving-certs-ca-bundle\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " 
pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.957507 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.957476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf6j8\" (UniqueName: \"kubernetes.io/projected/e93874f8-9ce0-4a04-8e9c-8ae407239e13-kube-api-access-tf6j8\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.957507 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.957510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-federate-client-tls\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.957744 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.957538 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-telemeter-client-tls\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.957744 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.957657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.957744 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.957699 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-serving-certs-ca-bundle\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.957744 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.957733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-secret-telemeter-client\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.958004 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.957801 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-metrics-client-ca\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.958004 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.957846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.958519 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.958496 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-metrics-client-ca\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: 
\"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.958619 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.958497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-serving-certs-ca-bundle\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.958665 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.958649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93874f8-9ce0-4a04-8e9c-8ae407239e13-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.960504 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.960476 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-secret-telemeter-client\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.960591 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.960520 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-federate-client-tls\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.960591 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.960543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.960659 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.960628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/e93874f8-9ce0-4a04-8e9c-8ae407239e13-telemeter-client-tls\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:37.965654 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:37.965630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf6j8\" (UniqueName: \"kubernetes.io/projected/e93874f8-9ce0-4a04-8e9c-8ae407239e13-kube-api-access-tf6j8\") pod \"telemeter-client-5f5f55ddc7-66h44\" (UID: \"e93874f8-9ce0-4a04-8e9c-8ae407239e13\") " pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:38.083765 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:38.083738 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" Apr 22 15:19:38.207255 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:38.207231 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5f5f55ddc7-66h44"] Apr 22 15:19:38.209517 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:19:38.209489 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93874f8_9ce0_4a04_8e9c_8ae407239e13.slice/crio-af5c87528c6eb5a7188a67a6f821dc5219f3a4fdeb7a3c94ae8708e012017a91 WatchSource:0}: Error finding container af5c87528c6eb5a7188a67a6f821dc5219f3a4fdeb7a3c94ae8708e012017a91: Status 404 returned error can't find the container with id af5c87528c6eb5a7188a67a6f821dc5219f3a4fdeb7a3c94ae8708e012017a91 Apr 22 15:19:38.211284 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:38.211268 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:19:38.599486 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:38.599450 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" event={"ID":"e93874f8-9ce0-4a04-8e9c-8ae407239e13","Type":"ContainerStarted","Data":"af5c87528c6eb5a7188a67a6f821dc5219f3a4fdeb7a3c94ae8708e012017a91"} Apr 22 15:19:40.607489 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:40.607454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" event={"ID":"e93874f8-9ce0-4a04-8e9c-8ae407239e13","Type":"ContainerStarted","Data":"6bc3ddf6c485c4e3932f89c0ee1d7fdf8eaa1036e2f5b29065a212338875721e"} Apr 22 15:19:40.607489 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:40.607495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" 
event={"ID":"e93874f8-9ce0-4a04-8e9c-8ae407239e13","Type":"ContainerStarted","Data":"38920a8de954ce05861c2dfc9b0b6a54e72ca777084db6a9a25940e16f34c6dc"} Apr 22 15:19:40.607910 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:40.607508 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" event={"ID":"e93874f8-9ce0-4a04-8e9c-8ae407239e13","Type":"ContainerStarted","Data":"2ebacba0fc8dcb8882362ff70f2a431a6186ca345cf27290b94d31a9d4ee2811"} Apr 22 15:19:40.629310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:40.629258 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5f5f55ddc7-66h44" podStartSLOduration=2.151866912 podStartE2EDuration="3.629241865s" podCreationTimestamp="2026-04-22 15:19:37 +0000 UTC" firstStartedPulling="2026-04-22 15:19:38.211392494 +0000 UTC m=+660.998029531" lastFinishedPulling="2026-04-22 15:19:39.688767431 +0000 UTC m=+662.475404484" observedRunningTime="2026-04-22 15:19:40.628492734 +0000 UTC m=+663.415129806" watchObservedRunningTime="2026-04-22 15:19:40.629241865 +0000 UTC m=+663.415878930" Apr 22 15:19:41.523343 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.523310 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-755cd4b745-k4bj5"] Apr 22 15:19:41.527189 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.527164 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.540519 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.540262 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-755cd4b745-k4bj5"] Apr 22 15:19:41.588789 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.588757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-trusted-ca-bundle\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.588954 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.588816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-oauth-serving-cert\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.588954 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.588832 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwwwm\" (UniqueName: \"kubernetes.io/projected/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-kube-api-access-cwwwm\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.588954 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.588850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-service-ca\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 
15:19:41.588954 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.588868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-serving-cert\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.588954 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.588889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-oauth-config\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.589142 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.588991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-config\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.690103 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.690069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-config\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.690516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.690111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-trusted-ca-bundle\") pod 
\"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.690516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.690360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-oauth-serving-cert\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.690516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.690398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwwwm\" (UniqueName: \"kubernetes.io/projected/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-kube-api-access-cwwwm\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.690516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.690430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-service-ca\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.690516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.690460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-serving-cert\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.690516 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.690495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-oauth-config\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.691057 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.691025 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-oauth-serving-cert\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.691480 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.691450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-service-ca\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.691598 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.691450 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-config\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.691598 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.691516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-trusted-ca-bundle\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.692867 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.692847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-oauth-config\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.693080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.693053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-serving-cert\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.699853 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.699831 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwwwm\" (UniqueName: \"kubernetes.io/projected/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-kube-api-access-cwwwm\") pod \"console-755cd4b745-k4bj5\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") " pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.843351 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.843264 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-755cd4b745-k4bj5" Apr 22 15:19:41.964567 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:41.964543 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-755cd4b745-k4bj5"] Apr 22 15:19:41.966956 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:19:41.966925 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c8dae8_4c48_4154_aa6d_8b1ba62db900.slice/crio-a1be000ae774045db378da386edee88346824da9fd0c333f66e611d45ee93b7f WatchSource:0}: Error finding container a1be000ae774045db378da386edee88346824da9fd0c333f66e611d45ee93b7f: Status 404 returned error can't find the container with id a1be000ae774045db378da386edee88346824da9fd0c333f66e611d45ee93b7f Apr 22 15:19:42.614981 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:42.614937 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-755cd4b745-k4bj5" event={"ID":"a9c8dae8-4c48-4154-aa6d-8b1ba62db900","Type":"ContainerStarted","Data":"16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c"} Apr 22 15:19:42.614981 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:42.614980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-755cd4b745-k4bj5" event={"ID":"a9c8dae8-4c48-4154-aa6d-8b1ba62db900","Type":"ContainerStarted","Data":"a1be000ae774045db378da386edee88346824da9fd0c333f66e611d45ee93b7f"} Apr 22 15:19:42.634339 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:42.634294 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-755cd4b745-k4bj5" podStartSLOduration=1.634278348 podStartE2EDuration="1.634278348s" podCreationTimestamp="2026-04-22 15:19:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:19:42.632588712 +0000 UTC 
m=+665.419225782" watchObservedRunningTime="2026-04-22 15:19:42.634278348 +0000 UTC m=+665.420915409" Apr 22 15:19:51.445576 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.445544 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-755cd4b745-k4bj5"] Apr 22 15:19:51.473601 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.473567 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5575fcffc4-cjbgc"] Apr 22 15:19:51.476844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.476824 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.485825 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.485803 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5575fcffc4-cjbgc"] Apr 22 15:19:51.575976 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.575941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-serving-cert\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.576139 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.575981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-oauth-config\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.576139 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.576054 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-service-ca\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.576139 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.576086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhcrx\" (UniqueName: \"kubernetes.io/projected/8df4cf41-e104-4b61-9f61-fa1efa52dba2-kube-api-access-hhcrx\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.576139 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.576110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-oauth-serving-cert\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.576355 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.576147 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-config\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.576355 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.576273 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-trusted-ca-bundle\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.677388 ip-10-0-137-228 
kubenswrapper[2577]: I0422 15:19:51.677360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-trusted-ca-bundle\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.677552 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.677399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-serving-cert\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.677552 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.677416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-oauth-config\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.677552 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.677459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-service-ca\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.677552 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.677487 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhcrx\" (UniqueName: \"kubernetes.io/projected/8df4cf41-e104-4b61-9f61-fa1efa52dba2-kube-api-access-hhcrx\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " 
pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.677552 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.677512 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-oauth-serving-cert\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.677819 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.677600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-config\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.678327 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.678299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-oauth-serving-cert\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.678427 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.678309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-trusted-ca-bundle\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.678427 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.678338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-service-ca\") pod \"console-5575fcffc4-cjbgc\" (UID: 
\"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.678427 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.678423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-config\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.679869 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.679848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-serving-cert\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.679948 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.679928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-oauth-config\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.685855 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.685831 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhcrx\" (UniqueName: \"kubernetes.io/projected/8df4cf41-e104-4b61-9f61-fa1efa52dba2-kube-api-access-hhcrx\") pod \"console-5575fcffc4-cjbgc\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:19:51.786958 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.786928 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5575fcffc4-cjbgc"
Apr 22 15:19:51.843405 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.843370 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-755cd4b745-k4bj5"
Apr 22 15:19:51.910643 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:51.910620 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5575fcffc4-cjbgc"]
Apr 22 15:19:51.913303 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:19:51.913274 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8df4cf41_e104_4b61_9f61_fa1efa52dba2.slice/crio-dac5f0cf850983e913c87b9ff3e6bfaff1d9f1a73b5756bfaf245a9911fd5400 WatchSource:0}: Error finding container dac5f0cf850983e913c87b9ff3e6bfaff1d9f1a73b5756bfaf245a9911fd5400: Status 404 returned error can't find the container with id dac5f0cf850983e913c87b9ff3e6bfaff1d9f1a73b5756bfaf245a9911fd5400
Apr 22 15:19:52.647141 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:52.647102 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5575fcffc4-cjbgc" event={"ID":"8df4cf41-e104-4b61-9f61-fa1efa52dba2","Type":"ContainerStarted","Data":"b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74"}
Apr 22 15:19:52.647141 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:52.647142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5575fcffc4-cjbgc" event={"ID":"8df4cf41-e104-4b61-9f61-fa1efa52dba2","Type":"ContainerStarted","Data":"dac5f0cf850983e913c87b9ff3e6bfaff1d9f1a73b5756bfaf245a9911fd5400"}
Apr 22 15:19:52.665530 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:19:52.665482 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5575fcffc4-cjbgc" podStartSLOduration=1.6654655219999999 podStartE2EDuration="1.665465522s" podCreationTimestamp="2026-04-22 15:19:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:19:52.663370872 +0000 UTC m=+675.450007931" watchObservedRunningTime="2026-04-22 15:19:52.665465522 +0000 UTC m=+675.452102583"
Apr 22 15:20:01.787153 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:01.787114 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5575fcffc4-cjbgc"
Apr 22 15:20:01.787545 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:01.787262 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5575fcffc4-cjbgc"
Apr 22 15:20:01.791971 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:01.791949 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5575fcffc4-cjbgc"
Apr 22 15:20:02.683592 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:02.683562 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5575fcffc4-cjbgc"
Apr 22 15:20:02.735896 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:02.735860 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74748b6745-5hk4w"]
Apr 22 15:20:16.464767 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.464701 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-755cd4b745-k4bj5" podUID="a9c8dae8-4c48-4154-aa6d-8b1ba62db900" containerName="console" containerID="cri-o://16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c" gracePeriod=15
Apr 22 15:20:16.694978 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.694955 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-755cd4b745-k4bj5_a9c8dae8-4c48-4154-aa6d-8b1ba62db900/console/0.log"
Apr 22 15:20:16.695080 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.695012 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-755cd4b745-k4bj5"
Apr 22 15:20:16.726508 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.726451 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-755cd4b745-k4bj5_a9c8dae8-4c48-4154-aa6d-8b1ba62db900/console/0.log"
Apr 22 15:20:16.726508 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.726499 2577 generic.go:358] "Generic (PLEG): container finished" podID="a9c8dae8-4c48-4154-aa6d-8b1ba62db900" containerID="16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c" exitCode=2
Apr 22 15:20:16.726655 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.726537 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-755cd4b745-k4bj5" event={"ID":"a9c8dae8-4c48-4154-aa6d-8b1ba62db900","Type":"ContainerDied","Data":"16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c"}
Apr 22 15:20:16.726655 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.726555 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-755cd4b745-k4bj5"
Apr 22 15:20:16.726655 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.726574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-755cd4b745-k4bj5" event={"ID":"a9c8dae8-4c48-4154-aa6d-8b1ba62db900","Type":"ContainerDied","Data":"a1be000ae774045db378da386edee88346824da9fd0c333f66e611d45ee93b7f"}
Apr 22 15:20:16.726655 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.726592 2577 scope.go:117] "RemoveContainer" containerID="16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c"
Apr 22 15:20:16.734399 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.734381 2577 scope.go:117] "RemoveContainer" containerID="16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c"
Apr 22 15:20:16.734643 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:20:16.734624 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c\": container with ID starting with 16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c not found: ID does not exist" containerID="16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c"
Apr 22 15:20:16.734707 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.734650 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c"} err="failed to get container status \"16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c\": rpc error: code = NotFound desc = could not find container \"16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c\": container with ID starting with 16cc8256f18843443885490cec3abf22199155361cffcea12405805f0bcf223c not found: ID does not exist"
Apr 22 15:20:16.811670 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.811642 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-config\") pod \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") "
Apr 22 15:20:16.811771 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.811698 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-service-ca\") pod \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") "
Apr 22 15:20:16.811771 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.811721 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-trusted-ca-bundle\") pod \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") "
Apr 22 15:20:16.811771 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.811736 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-oauth-serving-cert\") pod \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") "
Apr 22 15:20:16.811771 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.811766 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwwwm\" (UniqueName: \"kubernetes.io/projected/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-kube-api-access-cwwwm\") pod \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") "
Apr 22 15:20:16.811971 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.811792 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-oauth-config\") pod \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") "
Apr 22 15:20:16.811971 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.811828 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-serving-cert\") pod \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\" (UID: \"a9c8dae8-4c48-4154-aa6d-8b1ba62db900\") "
Apr 22 15:20:16.812130 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.812103 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-config" (OuterVolumeSpecName: "console-config") pod "a9c8dae8-4c48-4154-aa6d-8b1ba62db900" (UID: "a9c8dae8-4c48-4154-aa6d-8b1ba62db900"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:20:16.812310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.812286 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-service-ca" (OuterVolumeSpecName: "service-ca") pod "a9c8dae8-4c48-4154-aa6d-8b1ba62db900" (UID: "a9c8dae8-4c48-4154-aa6d-8b1ba62db900"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:20:16.812310 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.812298 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a9c8dae8-4c48-4154-aa6d-8b1ba62db900" (UID: "a9c8dae8-4c48-4154-aa6d-8b1ba62db900"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:20:16.812423 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.812307 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a9c8dae8-4c48-4154-aa6d-8b1ba62db900" (UID: "a9c8dae8-4c48-4154-aa6d-8b1ba62db900"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:20:16.813897 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.813872 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-kube-api-access-cwwwm" (OuterVolumeSpecName: "kube-api-access-cwwwm") pod "a9c8dae8-4c48-4154-aa6d-8b1ba62db900" (UID: "a9c8dae8-4c48-4154-aa6d-8b1ba62db900"). InnerVolumeSpecName "kube-api-access-cwwwm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:20:16.814390 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.814363 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a9c8dae8-4c48-4154-aa6d-8b1ba62db900" (UID: "a9c8dae8-4c48-4154-aa6d-8b1ba62db900"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:20:16.814390 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.814375 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a9c8dae8-4c48-4154-aa6d-8b1ba62db900" (UID: "a9c8dae8-4c48-4154-aa6d-8b1ba62db900"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:20:16.912900 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.912874 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-service-ca\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:16.912900 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.912897 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-trusted-ca-bundle\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:16.913040 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.912908 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-oauth-serving-cert\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:16.913040 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.912917 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cwwwm\" (UniqueName: \"kubernetes.io/projected/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-kube-api-access-cwwwm\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:16.913040 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.912926 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-oauth-config\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:16.913040 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.912936 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-serving-cert\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:16.913040 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:16.912945 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9c8dae8-4c48-4154-aa6d-8b1ba62db900-console-config\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:17.048575 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:17.048547 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-755cd4b745-k4bj5"]
Apr 22 15:20:17.052309 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:17.052286 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-755cd4b745-k4bj5"]
Apr 22 15:20:17.803217 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:17.803168 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c8dae8-4c48-4154-aa6d-8b1ba62db900" path="/var/lib/kubelet/pods/a9c8dae8-4c48-4154-aa6d-8b1ba62db900/volumes"
Apr 22 15:20:27.762051 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:27.762009 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74748b6745-5hk4w" podUID="d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" containerName="console" containerID="cri-o://209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34" gracePeriod=15
Apr 22 15:20:27.998796 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:27.998774 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74748b6745-5hk4w_d5d4dee4-992e-45c1-b58a-1a40a83b7c8c/console/0.log"
Apr 22 15:20:27.998917 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:27.998834 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74748b6745-5hk4w"
Apr 22 15:20:28.106907 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.106861 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-serving-cert\") pod \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") "
Apr 22 15:20:28.107089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.106922 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-service-ca\") pod \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") "
Apr 22 15:20:28.107089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.106963 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-oauth-serving-cert\") pod \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") "
Apr 22 15:20:28.107089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.106993 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmntp\" (UniqueName: \"kubernetes.io/projected/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-kube-api-access-bmntp\") pod \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") "
Apr 22 15:20:28.107089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.107041 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-config\") pod \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") "
Apr 22 15:20:28.107089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.107068 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-trusted-ca-bundle\") pod \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") "
Apr 22 15:20:28.107373 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.107101 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-oauth-config\") pod \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\" (UID: \"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c\") "
Apr 22 15:20:28.107421 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.107384 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" (UID: "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:20:28.107497 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.107474 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-service-ca" (OuterVolumeSpecName: "service-ca") pod "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" (UID: "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:20:28.107560 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.107486 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" (UID: "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:20:28.107560 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.107542 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-config" (OuterVolumeSpecName: "console-config") pod "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" (UID: "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 15:20:28.109185 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.109158 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-kube-api-access-bmntp" (OuterVolumeSpecName: "kube-api-access-bmntp") pod "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" (UID: "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c"). InnerVolumeSpecName "kube-api-access-bmntp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:20:28.109185 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.109169 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" (UID: "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:20:28.109368 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.109185 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" (UID: "d5d4dee4-992e-45c1-b58a-1a40a83b7c8c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 15:20:28.208326 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.208288 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-service-ca\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:28.208326 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.208323 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-oauth-serving-cert\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:28.208326 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.208333 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bmntp\" (UniqueName: \"kubernetes.io/projected/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-kube-api-access-bmntp\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:28.208326 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.208343 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-config\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:28.208559 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.208353 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-trusted-ca-bundle\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:28.208559 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.208362 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-oauth-config\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:28.208559 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.208371 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c-console-serving-cert\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:20:28.772594 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.772568 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74748b6745-5hk4w_d5d4dee4-992e-45c1-b58a-1a40a83b7c8c/console/0.log"
Apr 22 15:20:28.772961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.772615 2577 generic.go:358] "Generic (PLEG): container finished" podID="d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" containerID="209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34" exitCode=2
Apr 22 15:20:28.772961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.772651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74748b6745-5hk4w" event={"ID":"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c","Type":"ContainerDied","Data":"209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34"}
Apr 22 15:20:28.772961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.772673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74748b6745-5hk4w" event={"ID":"d5d4dee4-992e-45c1-b58a-1a40a83b7c8c","Type":"ContainerDied","Data":"60090e468330cf46f66e42f4fb5bcc5b8d03784c9dcf66440e6ed1b4a29f72c1"}
Apr 22 15:20:28.772961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.772686 2577 scope.go:117] "RemoveContainer" containerID="209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34"
Apr 22 15:20:28.772961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.772695 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74748b6745-5hk4w"
Apr 22 15:20:28.780987 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.780966 2577 scope.go:117] "RemoveContainer" containerID="209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34"
Apr 22 15:20:28.781256 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:20:28.781234 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34\": container with ID starting with 209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34 not found: ID does not exist" containerID="209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34"
Apr 22 15:20:28.781315 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.781266 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34"} err="failed to get container status \"209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34\": rpc error: code = NotFound desc = could not find container \"209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34\": container with ID starting with 209bf312938bc3ef605b28c2bf68f0a967839babf765bfd6267f9a36876e0b34 not found: ID does not exist"
Apr 22 15:20:28.795281 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.795257 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74748b6745-5hk4w"]
Apr 22 15:20:28.798946 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:28.798926 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74748b6745-5hk4w"]
Apr 22 15:20:29.803253 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:20:29.803185 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" path="/var/lib/kubelet/pods/d5d4dee4-992e-45c1-b58a-1a40a83b7c8c/volumes"
Apr 22 15:22:46.963440 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.963406 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-2nd8r"]
Apr 22 15:22:46.963899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.963779 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9c8dae8-4c48-4154-aa6d-8b1ba62db900" containerName="console"
Apr 22 15:22:46.963899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.963792 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c8dae8-4c48-4154-aa6d-8b1ba62db900" containerName="console"
Apr 22 15:22:46.963899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.963803 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" containerName="console"
Apr 22 15:22:46.963899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.963808 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" containerName="console"
Apr 22 15:22:46.963899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.963867 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5d4dee4-992e-45c1-b58a-1a40a83b7c8c" containerName="console"
Apr 22 15:22:46.963899 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.963877 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9c8dae8-4c48-4154-aa6d-8b1ba62db900" containerName="console"
Apr 22 15:22:46.966745 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.966730 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r"
Apr 22 15:22:46.969259 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.969238 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 15:22:46.969989 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.969970 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 15:22:46.969989 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.969982 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-9pqgg\""
Apr 22 15:22:46.974032 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:46.974010 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-2nd8r"]
Apr 22 15:22:47.089039 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:47.089005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30c657d6-9843-4a84-bce5-aa22981be51f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-2nd8r\" (UID: \"30c657d6-9843-4a84-bce5-aa22981be51f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r"
Apr 22 15:22:47.089213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:47.089047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75vnm\" (UniqueName: \"kubernetes.io/projected/30c657d6-9843-4a84-bce5-aa22981be51f-kube-api-access-75vnm\") pod \"cert-manager-cainjector-68b757865b-2nd8r\" (UID: \"30c657d6-9843-4a84-bce5-aa22981be51f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r"
Apr 22 15:22:47.189582 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:47.189550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30c657d6-9843-4a84-bce5-aa22981be51f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-2nd8r\" (UID: \"30c657d6-9843-4a84-bce5-aa22981be51f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r"
Apr 22 15:22:47.189772 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:47.189592 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75vnm\" (UniqueName: \"kubernetes.io/projected/30c657d6-9843-4a84-bce5-aa22981be51f-kube-api-access-75vnm\") pod \"cert-manager-cainjector-68b757865b-2nd8r\" (UID: \"30c657d6-9843-4a84-bce5-aa22981be51f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r"
Apr 22 15:22:47.197666 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:47.197638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/30c657d6-9843-4a84-bce5-aa22981be51f-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-2nd8r\" (UID: \"30c657d6-9843-4a84-bce5-aa22981be51f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r"
Apr 22 15:22:47.198210 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:47.198171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75vnm\" (UniqueName: \"kubernetes.io/projected/30c657d6-9843-4a84-bce5-aa22981be51f-kube-api-access-75vnm\") pod \"cert-manager-cainjector-68b757865b-2nd8r\" (UID: \"30c657d6-9843-4a84-bce5-aa22981be51f\") " pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r"
Apr 22 15:22:47.291247 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:47.291214 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r"
Apr 22 15:22:47.420481 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:47.420320 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-2nd8r"]
Apr 22 15:22:47.422955 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:22:47.422925 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30c657d6_9843_4a84_bce5_aa22981be51f.slice/crio-23bb16c4aa222c456a0e832cda7d8358c68b7d6e29dcc6a4d1a61c88845ea234 WatchSource:0}: Error finding container 23bb16c4aa222c456a0e832cda7d8358c68b7d6e29dcc6a4d1a61c88845ea234: Status 404 returned error can't find the container with id 23bb16c4aa222c456a0e832cda7d8358c68b7d6e29dcc6a4d1a61c88845ea234
Apr 22 15:22:48.196430 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:48.196385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r" event={"ID":"30c657d6-9843-4a84-bce5-aa22981be51f","Type":"ContainerStarted","Data":"23bb16c4aa222c456a0e832cda7d8358c68b7d6e29dcc6a4d1a61c88845ea234"}
Apr 22 15:22:51.208333 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:51.208296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r" event={"ID":"30c657d6-9843-4a84-bce5-aa22981be51f","Type":"ContainerStarted","Data":"a93cdd87902a70da7506a276a1cbe8725ab2b3ae820125a92e5395c08a4ec20e"}
Apr 22 15:22:51.223654 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:22:51.223609 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-2nd8r" podStartSLOduration=2.190193558 podStartE2EDuration="5.223597232s" podCreationTimestamp="2026-04-22 15:22:46 +0000 UTC" firstStartedPulling="2026-04-22 15:22:47.425123436 +0000 UTC m=+850.211760474" lastFinishedPulling="2026-04-22 15:22:50.458527101 +0000 UTC m=+853.245164148" observedRunningTime="2026-04-22 15:22:51.222376958 +0000 UTC m=+854.009014018" watchObservedRunningTime="2026-04-22 15:22:51.223597232 +0000 UTC m=+854.010234294"
Apr 22 15:23:37.739591 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:23:37.739562 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log"
Apr 22 15:23:37.742033 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:23:37.742010 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log"
Apr 22 15:23:37.742502 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:23:37.742482 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log"
Apr 22 15:23:37.745483 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:23:37.745464 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log"
Apr 22 15:24:46.931550 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:46.931474 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9bc45bfd4-mxvcv"]
Apr 22 15:24:46.934981 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:46.934957 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:46.949992 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:46.949965 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9bc45bfd4-mxvcv"]
Apr 22 15:24:47.002686 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.002653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-service-ca\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:47.002839 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.002744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-trusted-ca-bundle\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:47.002839 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.002776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-oauth-serving-cert\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:47.002839 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.002796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-config\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:47.002839 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.002820 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-serving-cert\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:47.003013 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.002859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-oauth-config\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:47.003013 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.002890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxwkp\" (UniqueName: \"kubernetes.io/projected/2f7fdc91-ebf1-4129-be66-e40b976ace74-kube-api-access-kxwkp\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:47.104084 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-service-ca\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv"
Apr 22 15:24:47.104303 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-trusted-ca-bundle\") pod \"console-9bc45bfd4-mxvcv\"
(UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.104303 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-oauth-serving-cert\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.104303 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-config\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.104303 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-serving-cert\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.104303 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104234 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-oauth-config\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.104303 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxwkp\" (UniqueName: 
\"kubernetes.io/projected/2f7fdc91-ebf1-4129-be66-e40b976ace74-kube-api-access-kxwkp\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.104906 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-service-ca\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.105018 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.104941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-oauth-serving-cert\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.105079 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.105028 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-config\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.105343 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.105324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f7fdc91-ebf1-4129-be66-e40b976ace74-trusted-ca-bundle\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.107126 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.107102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-oauth-config\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.107351 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.107331 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f7fdc91-ebf1-4129-be66-e40b976ace74-console-serving-cert\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.113059 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.113042 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxwkp\" (UniqueName: \"kubernetes.io/projected/2f7fdc91-ebf1-4129-be66-e40b976ace74-kube-api-access-kxwkp\") pod \"console-9bc45bfd4-mxvcv\" (UID: \"2f7fdc91-ebf1-4129-be66-e40b976ace74\") " pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.245457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.245366 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:47.373387 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.373360 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9bc45bfd4-mxvcv"] Apr 22 15:24:47.375276 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:24:47.375245 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7fdc91_ebf1_4129_be66_e40b976ace74.slice/crio-b1cde6fe7e434139fe021c3a08d4d7d3ba5ad89f1c209238f95936ecf234f775 WatchSource:0}: Error finding container b1cde6fe7e434139fe021c3a08d4d7d3ba5ad89f1c209238f95936ecf234f775: Status 404 returned error can't find the container with id b1cde6fe7e434139fe021c3a08d4d7d3ba5ad89f1c209238f95936ecf234f775 Apr 22 15:24:47.377089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.377072 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:24:47.556395 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.556354 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9bc45bfd4-mxvcv" event={"ID":"2f7fdc91-ebf1-4129-be66-e40b976ace74","Type":"ContainerStarted","Data":"619df67c1cd112f6c663751f923a7656944a350837f881bb70645cbd82f1dc0d"} Apr 22 15:24:47.556395 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.556402 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9bc45bfd4-mxvcv" event={"ID":"2f7fdc91-ebf1-4129-be66-e40b976ace74","Type":"ContainerStarted","Data":"b1cde6fe7e434139fe021c3a08d4d7d3ba5ad89f1c209238f95936ecf234f775"} Apr 22 15:24:47.573771 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:47.573726 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9bc45bfd4-mxvcv" podStartSLOduration=1.573712493 podStartE2EDuration="1.573712493s" podCreationTimestamp="2026-04-22 15:24:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:24:47.573170593 +0000 UTC m=+970.359807655" watchObservedRunningTime="2026-04-22 15:24:47.573712493 +0000 UTC m=+970.360349553" Apr 22 15:24:57.245482 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:57.245441 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:57.245873 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:57.245496 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:57.250280 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:57.250258 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:57.591961 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:57.591934 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9bc45bfd4-mxvcv" Apr 22 15:24:57.642685 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:24:57.642658 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5575fcffc4-cjbgc"] Apr 22 15:25:22.670790 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.670726 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5575fcffc4-cjbgc" podUID="8df4cf41-e104-4b61-9f61-fa1efa52dba2" containerName="console" containerID="cri-o://b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74" gracePeriod=15 Apr 22 15:25:22.680890 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.680865 2577 patch_prober.go:28] interesting pod/console-5575fcffc4-cjbgc container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.27:8443/health\": dial tcp 10.133.0.27:8443: connect: connection refused" start-of-body= 
Apr 22 15:25:22.680989 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.680922 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-5575fcffc4-cjbgc" podUID="8df4cf41-e104-4b61-9f61-fa1efa52dba2" containerName="console" probeResult="failure" output="Get \"https://10.133.0.27:8443/health\": dial tcp 10.133.0.27:8443: connect: connection refused" Apr 22 15:25:22.912786 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.912765 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5575fcffc4-cjbgc_8df4cf41-e104-4b61-9f61-fa1efa52dba2/console/0.log" Apr 22 15:25:22.912892 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.912827 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:25:22.996459 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996384 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-service-ca\") pod \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " Apr 22 15:25:22.996459 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996420 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-trusted-ca-bundle\") pod \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " Apr 22 15:25:22.996459 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996448 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-oauth-serving-cert\") pod \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " Apr 22 15:25:22.996740 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996482 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhcrx\" (UniqueName: \"kubernetes.io/projected/8df4cf41-e104-4b61-9f61-fa1efa52dba2-kube-api-access-hhcrx\") pod \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " Apr 22 15:25:22.996740 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996506 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-serving-cert\") pod \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " Apr 22 15:25:22.996740 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996536 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-oauth-config\") pod \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " Apr 22 15:25:22.996740 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996574 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-config\") pod \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\" (UID: \"8df4cf41-e104-4b61-9f61-fa1efa52dba2\") " Apr 22 15:25:22.996942 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996829 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8df4cf41-e104-4b61-9f61-fa1efa52dba2" (UID: "8df4cf41-e104-4b61-9f61-fa1efa52dba2"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:25:22.996942 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.996882 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-service-ca" (OuterVolumeSpecName: "service-ca") pod "8df4cf41-e104-4b61-9f61-fa1efa52dba2" (UID: "8df4cf41-e104-4b61-9f61-fa1efa52dba2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:25:22.997075 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.997049 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8df4cf41-e104-4b61-9f61-fa1efa52dba2" (UID: "8df4cf41-e104-4b61-9f61-fa1efa52dba2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:25:22.997188 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.997165 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-config" (OuterVolumeSpecName: "console-config") pod "8df4cf41-e104-4b61-9f61-fa1efa52dba2" (UID: "8df4cf41-e104-4b61-9f61-fa1efa52dba2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:25:22.998784 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.998750 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8df4cf41-e104-4b61-9f61-fa1efa52dba2" (UID: "8df4cf41-e104-4b61-9f61-fa1efa52dba2"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:25:22.998880 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.998820 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8df4cf41-e104-4b61-9f61-fa1efa52dba2-kube-api-access-hhcrx" (OuterVolumeSpecName: "kube-api-access-hhcrx") pod "8df4cf41-e104-4b61-9f61-fa1efa52dba2" (UID: "8df4cf41-e104-4b61-9f61-fa1efa52dba2"). InnerVolumeSpecName "kube-api-access-hhcrx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:25:22.998880 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:22.998832 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8df4cf41-e104-4b61-9f61-fa1efa52dba2" (UID: "8df4cf41-e104-4b61-9f61-fa1efa52dba2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:25:23.098102 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.098070 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-oauth-config\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:25:23.098102 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.098098 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-config\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:25:23.098102 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.098108 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-service-ca\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:25:23.098335 ip-10-0-137-228 
kubenswrapper[2577]: I0422 15:25:23.098116 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-trusted-ca-bundle\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:25:23.098335 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.098125 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8df4cf41-e104-4b61-9f61-fa1efa52dba2-oauth-serving-cert\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:25:23.098335 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.098133 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhcrx\" (UniqueName: \"kubernetes.io/projected/8df4cf41-e104-4b61-9f61-fa1efa52dba2-kube-api-access-hhcrx\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:25:23.098335 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.098141 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8df4cf41-e104-4b61-9f61-fa1efa52dba2-console-serving-cert\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:25:23.663903 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.663876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5575fcffc4-cjbgc_8df4cf41-e104-4b61-9f61-fa1efa52dba2/console/0.log" Apr 22 15:25:23.664085 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.663915 2577 generic.go:358] "Generic (PLEG): container finished" podID="8df4cf41-e104-4b61-9f61-fa1efa52dba2" containerID="b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74" exitCode=2 Apr 22 15:25:23.664085 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.663948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5575fcffc4-cjbgc" 
event={"ID":"8df4cf41-e104-4b61-9f61-fa1efa52dba2","Type":"ContainerDied","Data":"b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74"} Apr 22 15:25:23.664085 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.663986 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5575fcffc4-cjbgc" Apr 22 15:25:23.664085 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.663996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5575fcffc4-cjbgc" event={"ID":"8df4cf41-e104-4b61-9f61-fa1efa52dba2","Type":"ContainerDied","Data":"dac5f0cf850983e913c87b9ff3e6bfaff1d9f1a73b5756bfaf245a9911fd5400"} Apr 22 15:25:23.664085 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.664017 2577 scope.go:117] "RemoveContainer" containerID="b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74" Apr 22 15:25:23.672916 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.672755 2577 scope.go:117] "RemoveContainer" containerID="b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74" Apr 22 15:25:23.673138 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:25:23.672984 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74\": container with ID starting with b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74 not found: ID does not exist" containerID="b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74" Apr 22 15:25:23.673138 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.673009 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74"} err="failed to get container status \"b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74\": rpc error: code = NotFound desc = could not find container 
\"b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74\": container with ID starting with b9b1cfae7302bf104a59bba41dcfdf9d64a8e6e86d947d999b623582a4125a74 not found: ID does not exist" Apr 22 15:25:23.686210 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.686170 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5575fcffc4-cjbgc"] Apr 22 15:25:23.696762 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.696740 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5575fcffc4-cjbgc"] Apr 22 15:25:23.803740 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:23.803714 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8df4cf41-e104-4b61-9f61-fa1efa52dba2" path="/var/lib/kubelet/pods/8df4cf41-e104-4b61-9f61-fa1efa52dba2/volumes" Apr 22 15:25:48.468885 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.468849 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"] Apr 22 15:25:48.469288 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.469229 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8df4cf41-e104-4b61-9f61-fa1efa52dba2" containerName="console" Apr 22 15:25:48.469288 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.469243 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8df4cf41-e104-4b61-9f61-fa1efa52dba2" containerName="console" Apr 22 15:25:48.469359 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.469303 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8df4cf41-e104-4b61-9f61-fa1efa52dba2" containerName="console" Apr 22 15:25:48.473752 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.473733 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" Apr 22 15:25:48.476495 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.476467 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"default-dockercfg-sbhg8\"" Apr 22 15:25:48.476620 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.476505 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"openshift-service-ca.crt\"" Apr 22 15:25:48.476620 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.476504 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"kube-root-ca.crt\"" Apr 22 15:25:48.486460 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.486435 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"] Apr 22 15:25:48.608575 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.608544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx7g9\" (UniqueName: \"kubernetes.io/projected/891aeac4-0a74-4b81-af81-f2b81c636bce-kube-api-access-wx7g9\") pod \"progression-enabled-node-0-0-7bzdg\" (UID: \"891aeac4-0a74-4b81-af81-f2b81c636bce\") " pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" Apr 22 15:25:48.709359 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.709318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx7g9\" (UniqueName: \"kubernetes.io/projected/891aeac4-0a74-4b81-af81-f2b81c636bce-kube-api-access-wx7g9\") pod \"progression-enabled-node-0-0-7bzdg\" (UID: \"891aeac4-0a74-4b81-af81-f2b81c636bce\") " pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" Apr 22 15:25:48.717119 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.717086 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wx7g9\" (UniqueName: \"kubernetes.io/projected/891aeac4-0a74-4b81-af81-f2b81c636bce-kube-api-access-wx7g9\") pod \"progression-enabled-node-0-0-7bzdg\" (UID: \"891aeac4-0a74-4b81-af81-f2b81c636bce\") " pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"
Apr 22 15:25:48.784239 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.784211 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"
Apr 22 15:25:48.902949 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:48.902927 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"]
Apr 22 15:25:48.905430 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:25:48.905401 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod891aeac4_0a74_4b81_af81_f2b81c636bce.slice/crio-d0bf98cbfc95107a211666a084983639f7a56da535a5bd81c5439f845b62b773 WatchSource:0}: Error finding container d0bf98cbfc95107a211666a084983639f7a56da535a5bd81c5439f845b62b773: Status 404 returned error can't find the container with id d0bf98cbfc95107a211666a084983639f7a56da535a5bd81c5439f845b62b773
Apr 22 15:25:49.747416 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:25:49.747377 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" event={"ID":"891aeac4-0a74-4b81-af81-f2b81c636bce","Type":"ContainerStarted","Data":"d0bf98cbfc95107a211666a084983639f7a56da535a5bd81c5439f845b62b773"}
Apr 22 15:27:52.197765 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:27:52.197727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" event={"ID":"891aeac4-0a74-4b81-af81-f2b81c636bce","Type":"ContainerStarted","Data":"87b091de58da79c74cd9705494c4d3d8b86f52d7efa99aa882509f24d8026514"}
Apr 22 15:27:52.198213 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:27:52.197865 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"
Apr 22 15:27:52.234018 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:27:52.233949 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" podStartSLOduration=1.739211249 podStartE2EDuration="2m4.233932462s" podCreationTimestamp="2026-04-22 15:25:48 +0000 UTC" firstStartedPulling="2026-04-22 15:25:48.907421106 +0000 UTC m=+1031.694058147" lastFinishedPulling="2026-04-22 15:27:51.402142317 +0000 UTC m=+1154.188779360" observedRunningTime="2026-04-22 15:27:52.232116718 +0000 UTC m=+1155.018753777" watchObservedRunningTime="2026-04-22 15:27:52.233932462 +0000 UTC m=+1155.020569522"
Apr 22 15:27:53.201429 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:27:53.201392 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"
Apr 22 15:28:15.198783 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:15.198692 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" podUID="891aeac4-0a74-4b81-af81-f2b81c636bce" containerName="node" probeResult="failure" output="Get \"http://10.133.0.30:28080/metrics\": dial tcp 10.133.0.30:28080: connect: connection refused"
Apr 22 15:28:15.282925 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:15.282891 2577 generic.go:358] "Generic (PLEG): container finished" podID="891aeac4-0a74-4b81-af81-f2b81c636bce" containerID="87b091de58da79c74cd9705494c4d3d8b86f52d7efa99aa882509f24d8026514" exitCode=0
Apr 22 15:28:15.283072 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:15.282963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" event={"ID":"891aeac4-0a74-4b81-af81-f2b81c636bce","Type":"ContainerDied","Data":"87b091de58da79c74cd9705494c4d3d8b86f52d7efa99aa882509f24d8026514"}
Apr 22 15:28:16.412042 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:16.412018 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"
Apr 22 15:28:16.504458 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:16.504426 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx7g9\" (UniqueName: \"kubernetes.io/projected/891aeac4-0a74-4b81-af81-f2b81c636bce-kube-api-access-wx7g9\") pod \"891aeac4-0a74-4b81-af81-f2b81c636bce\" (UID: \"891aeac4-0a74-4b81-af81-f2b81c636bce\") "
Apr 22 15:28:16.506531 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:16.506502 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/891aeac4-0a74-4b81-af81-f2b81c636bce-kube-api-access-wx7g9" (OuterVolumeSpecName: "kube-api-access-wx7g9") pod "891aeac4-0a74-4b81-af81-f2b81c636bce" (UID: "891aeac4-0a74-4b81-af81-f2b81c636bce"). InnerVolumeSpecName "kube-api-access-wx7g9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:28:16.605994 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:16.605929 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wx7g9\" (UniqueName: \"kubernetes.io/projected/891aeac4-0a74-4b81-af81-f2b81c636bce-kube-api-access-wx7g9\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:28:17.291336 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:17.291301 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg" event={"ID":"891aeac4-0a74-4b81-af81-f2b81c636bce","Type":"ContainerDied","Data":"d0bf98cbfc95107a211666a084983639f7a56da535a5bd81c5439f845b62b773"}
Apr 22 15:28:17.291336 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:17.291320 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"
Apr 22 15:28:17.291336 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:17.291334 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0bf98cbfc95107a211666a084983639f7a56da535a5bd81c5439f845b62b773"
Apr 22 15:28:18.964873 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.964838 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"]
Apr 22 15:28:18.965270 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.965185 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="891aeac4-0a74-4b81-af81-f2b81c636bce" containerName="node"
Apr 22 15:28:18.965270 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.965209 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="891aeac4-0a74-4b81-af81-f2b81c636bce" containerName="node"
Apr 22 15:28:18.965270 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.965268 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="891aeac4-0a74-4b81-af81-f2b81c636bce" containerName="node"
Apr 22 15:28:18.995004 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.994979 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"]
Apr 22 15:28:18.995147 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.995082 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:18.997665 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.997639 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"openshift-service-ca.crt\""
Apr 22 15:28:18.997838 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.997639 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"kube-root-ca.crt\""
Apr 22 15:28:18.997838 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:18.997642 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"default-dockercfg-sbhg8\""
Apr 22 15:28:19.125661 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:19.125632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjnvl\" (UniqueName: \"kubernetes.io/projected/083c42ca-1a40-45e5-9a61-89b5a1deddec-kube-api-access-hjnvl\") pod \"progression-disabled-node-0-0-vscnx\" (UID: \"083c42ca-1a40-45e5-9a61-89b5a1deddec\") " pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:19.226809 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:19.226730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjnvl\" (UniqueName: \"kubernetes.io/projected/083c42ca-1a40-45e5-9a61-89b5a1deddec-kube-api-access-hjnvl\") pod \"progression-disabled-node-0-0-vscnx\" (UID: \"083c42ca-1a40-45e5-9a61-89b5a1deddec\") " pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:19.236125 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:19.236102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjnvl\" (UniqueName: \"kubernetes.io/projected/083c42ca-1a40-45e5-9a61-89b5a1deddec-kube-api-access-hjnvl\") pod \"progression-disabled-node-0-0-vscnx\" (UID: \"083c42ca-1a40-45e5-9a61-89b5a1deddec\") " pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:19.304168 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:19.304141 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:19.426559 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:19.426535 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"]
Apr 22 15:28:19.428476 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:28:19.428448 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod083c42ca_1a40_45e5_9a61_89b5a1deddec.slice/crio-90c794807085d21d40b159e71008ad4dde504b5564b27cd898e355ebafcdfe11 WatchSource:0}: Error finding container 90c794807085d21d40b159e71008ad4dde504b5564b27cd898e355ebafcdfe11: Status 404 returned error can't find the container with id 90c794807085d21d40b159e71008ad4dde504b5564b27cd898e355ebafcdfe11
Apr 22 15:28:20.303182 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:20.303143 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx" event={"ID":"083c42ca-1a40-45e5-9a61-89b5a1deddec","Type":"ContainerStarted","Data":"55da5bb2f07c9b3ffc99ee4d0b23e90d9251eeefe2ddbacc1e5ef75fa7a35c6e"}
Apr 22 15:28:20.303182 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:20.303179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx" event={"ID":"083c42ca-1a40-45e5-9a61-89b5a1deddec","Type":"ContainerStarted","Data":"90c794807085d21d40b159e71008ad4dde504b5564b27cd898e355ebafcdfe11"}
Apr 22 15:28:20.303757 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:20.303266 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:20.320786 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:20.320738 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx" podStartSLOduration=2.32072423 podStartE2EDuration="2.32072423s" podCreationTimestamp="2026-04-22 15:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:28:20.320420386 +0000 UTC m=+1183.107057451" watchObservedRunningTime="2026-04-22 15:28:20.32072423 +0000 UTC m=+1183.107361290"
Apr 22 15:28:21.306492 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:21.306460 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:37.764139 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:37.764108 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log"
Apr 22 15:28:37.767272 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:37.767244 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log"
Apr 22 15:28:37.768923 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:37.768901 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log"
Apr 22 15:28:37.772039 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:37.772021 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log"
Apr 22 15:28:42.587416 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:42.587362 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx" podUID="083c42ca-1a40-45e5-9a61-89b5a1deddec" containerName="node" probeResult="failure" output="Get \"http://10.133.0.31:28080/metrics\": read tcp 10.133.0.2:40068->10.133.0.31:28080: read: connection reset by peer"
Apr 22 15:28:43.304260 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:43.304218 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx" podUID="083c42ca-1a40-45e5-9a61-89b5a1deddec" containerName="node" probeResult="failure" output="Get \"http://10.133.0.31:28080/metrics\": dial tcp 10.133.0.31:28080: connect: connection refused"
Apr 22 15:28:43.304457 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:43.304344 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:43.304794 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:43.304769 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx" podUID="083c42ca-1a40-45e5-9a61-89b5a1deddec" containerName="node" probeResult="failure" output="Get \"http://10.133.0.31:28080/metrics\": dial tcp 10.133.0.31:28080: connect: connection refused"
Apr 22 15:28:43.381654 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:43.381621 2577 generic.go:358] "Generic (PLEG): container finished" podID="083c42ca-1a40-45e5-9a61-89b5a1deddec" containerID="55da5bb2f07c9b3ffc99ee4d0b23e90d9251eeefe2ddbacc1e5ef75fa7a35c6e" exitCode=0
Apr 22 15:28:43.381654 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:43.381655 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx" event={"ID":"083c42ca-1a40-45e5-9a61-89b5a1deddec","Type":"ContainerDied","Data":"55da5bb2f07c9b3ffc99ee4d0b23e90d9251eeefe2ddbacc1e5ef75fa7a35c6e"}
Apr 22 15:28:44.520491 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:44.520468 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:44.536934 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:44.536909 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjnvl\" (UniqueName: \"kubernetes.io/projected/083c42ca-1a40-45e5-9a61-89b5a1deddec-kube-api-access-hjnvl\") pod \"083c42ca-1a40-45e5-9a61-89b5a1deddec\" (UID: \"083c42ca-1a40-45e5-9a61-89b5a1deddec\") "
Apr 22 15:28:44.539173 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:44.539138 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/083c42ca-1a40-45e5-9a61-89b5a1deddec-kube-api-access-hjnvl" (OuterVolumeSpecName: "kube-api-access-hjnvl") pod "083c42ca-1a40-45e5-9a61-89b5a1deddec" (UID: "083c42ca-1a40-45e5-9a61-89b5a1deddec"). InnerVolumeSpecName "kube-api-access-hjnvl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:28:44.638021 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:44.637954 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjnvl\" (UniqueName: \"kubernetes.io/projected/083c42ca-1a40-45e5-9a61-89b5a1deddec-kube-api-access-hjnvl\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:28:45.391083 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:45.391050 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"
Apr 22 15:28:45.391270 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:45.391053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx" event={"ID":"083c42ca-1a40-45e5-9a61-89b5a1deddec","Type":"ContainerDied","Data":"90c794807085d21d40b159e71008ad4dde504b5564b27cd898e355ebafcdfe11"}
Apr 22 15:28:45.391270 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:45.391158 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c794807085d21d40b159e71008ad4dde504b5564b27cd898e355ebafcdfe11"
Apr 22 15:28:53.981503 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.981468 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"]
Apr 22 15:28:53.981943 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.981828 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="083c42ca-1a40-45e5-9a61-89b5a1deddec" containerName="node"
Apr 22 15:28:53.981943 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.981839 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="083c42ca-1a40-45e5-9a61-89b5a1deddec" containerName="node"
Apr 22 15:28:53.981943 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.981906 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="083c42ca-1a40-45e5-9a61-89b5a1deddec" containerName="node"
Apr 22 15:28:53.984744 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.984729 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:28:53.987107 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.987082 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"openshift-service-ca.crt\""
Apr 22 15:28:53.987107 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.987093 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"default-dockercfg-sbhg8\""
Apr 22 15:28:53.987107 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.987086 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"kube-root-ca.crt\""
Apr 22 15:28:53.994678 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:53.994643 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"]
Apr 22 15:28:54.017369 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:54.017343 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftq2j\" (UniqueName: \"kubernetes.io/projected/6bd7a0c2-6526-488a-bbee-c668d4004f52-kube-api-access-ftq2j\") pod \"progression-invalid-node-0-0-qksz8\" (UID: \"6bd7a0c2-6526-488a-bbee-c668d4004f52\") " pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:28:54.118374 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:54.118347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftq2j\" (UniqueName: \"kubernetes.io/projected/6bd7a0c2-6526-488a-bbee-c668d4004f52-kube-api-access-ftq2j\") pod \"progression-invalid-node-0-0-qksz8\" (UID: \"6bd7a0c2-6526-488a-bbee-c668d4004f52\") " pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:28:54.126402 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:54.126376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftq2j\" (UniqueName: \"kubernetes.io/projected/6bd7a0c2-6526-488a-bbee-c668d4004f52-kube-api-access-ftq2j\") pod \"progression-invalid-node-0-0-qksz8\" (UID: \"6bd7a0c2-6526-488a-bbee-c668d4004f52\") " pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:28:54.294331 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:54.294291 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:28:54.414246 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:54.414180 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"]
Apr 22 15:28:54.416922 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:28:54.416892 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bd7a0c2_6526_488a_bbee_c668d4004f52.slice/crio-ed53d075673ed0c7fcd0c816262cc10ce89b19161862c4777ebfd2a709429de2 WatchSource:0}: Error finding container ed53d075673ed0c7fcd0c816262cc10ce89b19161862c4777ebfd2a709429de2: Status 404 returned error can't find the container with id ed53d075673ed0c7fcd0c816262cc10ce89b19161862c4777ebfd2a709429de2
Apr 22 15:28:54.424662 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:54.424637 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8" event={"ID":"6bd7a0c2-6526-488a-bbee-c668d4004f52","Type":"ContainerStarted","Data":"ed53d075673ed0c7fcd0c816262cc10ce89b19161862c4777ebfd2a709429de2"}
Apr 22 15:28:55.430174 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:55.430136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8" event={"ID":"6bd7a0c2-6526-488a-bbee-c668d4004f52","Type":"ContainerStarted","Data":"dff4fb791050a05ff4b6bc0ee2dd0b38601b32a93a1231ec17dc1111e8b444de"}
Apr 22 15:28:55.430578 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:55.430252 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:28:55.445994 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:55.445940 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8" podStartSLOduration=2.445925381 podStartE2EDuration="2.445925381s" podCreationTimestamp="2026-04-22 15:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:28:55.444235778 +0000 UTC m=+1218.230872838" watchObservedRunningTime="2026-04-22 15:28:55.445925381 +0000 UTC m=+1218.232562442"
Apr 22 15:28:56.432648 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:28:56.432611 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:29:17.497759 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:17.497708 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8" podUID="6bd7a0c2-6526-488a-bbee-c668d4004f52" containerName="node" probeResult="failure" output="Get \"http://10.133.0.32:28080/metrics\": read tcp 10.133.0.2:45346->10.133.0.32:28080: read: connection reset by peer"
Apr 22 15:29:18.431512 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:18.431473 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8" podUID="6bd7a0c2-6526-488a-bbee-c668d4004f52" containerName="node" probeResult="failure" output="Get \"http://10.133.0.32:28080/metrics\": dial tcp 10.133.0.32:28080: connect: connection refused"
Apr 22 15:29:18.431704 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:18.431595 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:29:18.432123 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:18.432087 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8" podUID="6bd7a0c2-6526-488a-bbee-c668d4004f52" containerName="node" probeResult="failure" output="Get \"http://10.133.0.32:28080/metrics\": dial tcp 10.133.0.32:28080: connect: connection refused"
Apr 22 15:29:18.508910 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:18.508880 2577 generic.go:358] "Generic (PLEG): container finished" podID="6bd7a0c2-6526-488a-bbee-c668d4004f52" containerID="dff4fb791050a05ff4b6bc0ee2dd0b38601b32a93a1231ec17dc1111e8b444de" exitCode=0
Apr 22 15:29:18.509284 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:18.508928 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8" event={"ID":"6bd7a0c2-6526-488a-bbee-c668d4004f52","Type":"ContainerDied","Data":"dff4fb791050a05ff4b6bc0ee2dd0b38601b32a93a1231ec17dc1111e8b444de"}
Apr 22 15:29:19.638910 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:19.638883 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:29:19.719556 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:19.719525 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftq2j\" (UniqueName: \"kubernetes.io/projected/6bd7a0c2-6526-488a-bbee-c668d4004f52-kube-api-access-ftq2j\") pod \"6bd7a0c2-6526-488a-bbee-c668d4004f52\" (UID: \"6bd7a0c2-6526-488a-bbee-c668d4004f52\") "
Apr 22 15:29:19.721487 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:19.721465 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd7a0c2-6526-488a-bbee-c668d4004f52-kube-api-access-ftq2j" (OuterVolumeSpecName: "kube-api-access-ftq2j") pod "6bd7a0c2-6526-488a-bbee-c668d4004f52" (UID: "6bd7a0c2-6526-488a-bbee-c668d4004f52"). InnerVolumeSpecName "kube-api-access-ftq2j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:29:19.820659 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:19.820638 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftq2j\" (UniqueName: \"kubernetes.io/projected/6bd7a0c2-6526-488a-bbee-c668d4004f52-kube-api-access-ftq2j\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:29:20.515670 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:20.515634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8" event={"ID":"6bd7a0c2-6526-488a-bbee-c668d4004f52","Type":"ContainerDied","Data":"ed53d075673ed0c7fcd0c816262cc10ce89b19161862c4777ebfd2a709429de2"}
Apr 22 15:29:20.515670 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:20.515653 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"
Apr 22 15:29:20.515670 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:29:20.515665 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed53d075673ed0c7fcd0c816262cc10ce89b19161862c4777ebfd2a709429de2"
Apr 22 15:31:15.670478 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.670446 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"]
Apr 22 15:31:15.670931 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.670787 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bd7a0c2-6526-488a-bbee-c668d4004f52" containerName="node"
Apr 22 15:31:15.670931 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.670798 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd7a0c2-6526-488a-bbee-c668d4004f52" containerName="node"
Apr 22 15:31:15.670931 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.670856 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bd7a0c2-6526-488a-bbee-c668d4004f52" containerName="node"
Apr 22 15:31:15.673804 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.673788 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"
Apr 22 15:31:15.676072 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.676039 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"default-dockercfg-sbhg8\""
Apr 22 15:31:15.676870 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.676852 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"openshift-service-ca.crt\""
Apr 22 15:31:15.676987 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.676919 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-tdbgv\"/\"kube-root-ca.crt\""
Apr 22 15:31:15.686228 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.685633 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"]
Apr 22 15:31:15.774686 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.774652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krd74\" (UniqueName: \"kubernetes.io/projected/ef692d06-d8e4-413a-b0c6-82975ae615e1-kube-api-access-krd74\") pod \"progression-no-metrics-node-0-0-rscml\" (UID: \"ef692d06-d8e4-413a-b0c6-82975ae615e1\") " pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"
Apr 22 15:31:15.875070 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.875039 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krd74\" (UniqueName: \"kubernetes.io/projected/ef692d06-d8e4-413a-b0c6-82975ae615e1-kube-api-access-krd74\") pod \"progression-no-metrics-node-0-0-rscml\" (UID: \"ef692d06-d8e4-413a-b0c6-82975ae615e1\") " pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"
Apr 22 15:31:15.883579 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.883557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krd74\" (UniqueName: \"kubernetes.io/projected/ef692d06-d8e4-413a-b0c6-82975ae615e1-kube-api-access-krd74\") pod \"progression-no-metrics-node-0-0-rscml\" (UID: \"ef692d06-d8e4-413a-b0c6-82975ae615e1\") " pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"
Apr 22 15:31:15.989321 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:15.989254 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"
Apr 22 15:31:16.108643 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:16.108617 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"]
Apr 22 15:31:16.111110 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:31:16.111084 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef692d06_d8e4_413a_b0c6_82975ae615e1.slice/crio-59f77526fc4d092dd1d56cf9dfa7b3aeaa73d74b86a7a95dcc53522d711195c0 WatchSource:0}: Error finding container 59f77526fc4d092dd1d56cf9dfa7b3aeaa73d74b86a7a95dcc53522d711195c0: Status 404 returned error can't find the container with id 59f77526fc4d092dd1d56cf9dfa7b3aeaa73d74b86a7a95dcc53522d711195c0
Apr 22 15:31:16.113091 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:16.113075 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:31:16.915851 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:16.915812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml" event={"ID":"ef692d06-d8e4-413a-b0c6-82975ae615e1","Type":"ContainerStarted","Data":"2b995da4c93937673ac473d0387bfb8f7436b01e44ca0a32155a6a314b773494"}
Apr 22 15:31:16.915851 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:16.915850 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml" event={"ID":"ef692d06-d8e4-413a-b0c6-82975ae615e1","Type":"ContainerStarted","Data":"59f77526fc4d092dd1d56cf9dfa7b3aeaa73d74b86a7a95dcc53522d711195c0"}
Apr 22 15:31:16.933844 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:16.933800 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml" podStartSLOduration=1.933786219 podStartE2EDuration="1.933786219s" podCreationTimestamp="2026-04-22 15:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:31:16.931545452 +0000 UTC m=+1359.718182512" watchObservedRunningTime="2026-04-22 15:31:16.933786219 +0000 UTC m=+1359.720423278"
Apr 22 15:31:21.933051 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:21.933016 2577 generic.go:358] "Generic (PLEG): container finished" podID="ef692d06-d8e4-413a-b0c6-82975ae615e1" containerID="2b995da4c93937673ac473d0387bfb8f7436b01e44ca0a32155a6a314b773494" exitCode=0
Apr 22 15:31:21.933440 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:21.933093 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml" event={"ID":"ef692d06-d8e4-413a-b0c6-82975ae615e1","Type":"ContainerDied","Data":"2b995da4c93937673ac473d0387bfb8f7436b01e44ca0a32155a6a314b773494"}
Apr 22 15:31:23.062930 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:23.062904 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"
Apr 22 15:31:23.236278 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:23.236158 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krd74\" (UniqueName: \"kubernetes.io/projected/ef692d06-d8e4-413a-b0c6-82975ae615e1-kube-api-access-krd74\") pod \"ef692d06-d8e4-413a-b0c6-82975ae615e1\" (UID: \"ef692d06-d8e4-413a-b0c6-82975ae615e1\") "
Apr 22 15:31:23.238174 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:23.238145 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef692d06-d8e4-413a-b0c6-82975ae615e1-kube-api-access-krd74" (OuterVolumeSpecName: "kube-api-access-krd74") pod "ef692d06-d8e4-413a-b0c6-82975ae615e1" (UID: "ef692d06-d8e4-413a-b0c6-82975ae615e1"). InnerVolumeSpecName "kube-api-access-krd74". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 15:31:23.337382 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:23.337357 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-krd74\" (UniqueName: \"kubernetes.io/projected/ef692d06-d8e4-413a-b0c6-82975ae615e1-kube-api-access-krd74\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\""
Apr 22 15:31:23.941988 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:23.941962 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"
Apr 22 15:31:23.941988 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:23.941982 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml" event={"ID":"ef692d06-d8e4-413a-b0c6-82975ae615e1","Type":"ContainerDied","Data":"59f77526fc4d092dd1d56cf9dfa7b3aeaa73d74b86a7a95dcc53522d711195c0"}
Apr 22 15:31:23.942177 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:23.942012 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59f77526fc4d092dd1d56cf9dfa7b3aeaa73d74b86a7a95dcc53522d711195c0"
Apr 22 15:31:28.175941 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.175907 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sg7ld/must-gather-hjlb4"]
Apr 22 15:31:28.176464 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.176329 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef692d06-d8e4-413a-b0c6-82975ae615e1" containerName="node"
Apr 22 15:31:28.176464 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.176348 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef692d06-d8e4-413a-b0c6-82975ae615e1" containerName="node"
Apr 22 15:31:28.176464 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.176427 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef692d06-d8e4-413a-b0c6-82975ae615e1" containerName="node"
Apr 22 15:31:28.179702 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.179684 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sg7ld/must-gather-hjlb4"
Apr 22 15:31:28.181891 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.181867 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sg7ld\"/\"openshift-service-ca.crt\""
Apr 22 15:31:28.181982 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.181959 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sg7ld\"/\"kube-root-ca.crt\""
Apr 22 15:31:28.182741 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.182724 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sg7ld\"/\"default-dockercfg-54v8l\""
Apr 22 15:31:28.185820 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.185802 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sg7ld/must-gather-hjlb4"]
Apr 22 15:31:28.278083 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.278047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29vrg\" (UniqueName: \"kubernetes.io/projected/838dbe94-9e5f-4e14-90bb-acff40dc38f7-kube-api-access-29vrg\") pod \"must-gather-hjlb4\" (UID: \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\") " pod="openshift-must-gather-sg7ld/must-gather-hjlb4"
Apr 22 15:31:28.278271 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.278093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/838dbe94-9e5f-4e14-90bb-acff40dc38f7-must-gather-output\") pod \"must-gather-hjlb4\" (UID: \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\") " pod="openshift-must-gather-sg7ld/must-gather-hjlb4"
Apr 22 15:31:28.379063 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.379030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29vrg\" (UniqueName:
\"kubernetes.io/projected/838dbe94-9e5f-4e14-90bb-acff40dc38f7-kube-api-access-29vrg\") pod \"must-gather-hjlb4\" (UID: \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\") " pod="openshift-must-gather-sg7ld/must-gather-hjlb4" Apr 22 15:31:28.379227 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.379083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/838dbe94-9e5f-4e14-90bb-acff40dc38f7-must-gather-output\") pod \"must-gather-hjlb4\" (UID: \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\") " pod="openshift-must-gather-sg7ld/must-gather-hjlb4" Apr 22 15:31:28.379514 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.379496 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/838dbe94-9e5f-4e14-90bb-acff40dc38f7-must-gather-output\") pod \"must-gather-hjlb4\" (UID: \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\") " pod="openshift-must-gather-sg7ld/must-gather-hjlb4" Apr 22 15:31:28.386671 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.386651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29vrg\" (UniqueName: \"kubernetes.io/projected/838dbe94-9e5f-4e14-90bb-acff40dc38f7-kube-api-access-29vrg\") pod \"must-gather-hjlb4\" (UID: \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\") " pod="openshift-must-gather-sg7ld/must-gather-hjlb4" Apr 22 15:31:28.488949 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.488882 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" Apr 22 15:31:28.607990 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.607963 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sg7ld/must-gather-hjlb4"] Apr 22 15:31:28.610024 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:31:28.609997 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod838dbe94_9e5f_4e14_90bb_acff40dc38f7.slice/crio-190f4e72828299fc9e3ddcf05f1cfe14da471feaed33c5788f0d9dbca5348be0 WatchSource:0}: Error finding container 190f4e72828299fc9e3ddcf05f1cfe14da471feaed33c5788f0d9dbca5348be0: Status 404 returned error can't find the container with id 190f4e72828299fc9e3ddcf05f1cfe14da471feaed33c5788f0d9dbca5348be0 Apr 22 15:31:28.958967 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:28.958936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" event={"ID":"838dbe94-9e5f-4e14-90bb-acff40dc38f7","Type":"ContainerStarted","Data":"190f4e72828299fc9e3ddcf05f1cfe14da471feaed33c5788f0d9dbca5348be0"} Apr 22 15:31:32.758451 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:32.758404 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"] Apr 22 15:31:32.764235 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:32.764187 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-disabled-node-0-0-vscnx"] Apr 22 15:31:32.770485 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:32.770455 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"] Apr 22 15:31:32.774850 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:32.774828 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-enabled-node-0-0-7bzdg"] Apr 
22 15:31:32.779491 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:32.779466 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"] Apr 22 15:31:32.783291 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:32.783263 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-invalid-node-0-0-qksz8"] Apr 22 15:31:32.798445 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:32.798418 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"] Apr 22 15:31:32.801905 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:32.801881 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-tdbgv/progression-no-metrics-node-0-0-rscml"] Apr 22 15:31:33.804609 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:33.804568 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="083c42ca-1a40-45e5-9a61-89b5a1deddec" path="/var/lib/kubelet/pods/083c42ca-1a40-45e5-9a61-89b5a1deddec/volumes" Apr 22 15:31:33.805010 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:33.804968 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd7a0c2-6526-488a-bbee-c668d4004f52" path="/var/lib/kubelet/pods/6bd7a0c2-6526-488a-bbee-c668d4004f52/volumes" Apr 22 15:31:33.805381 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:33.805361 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="891aeac4-0a74-4b81-af81-f2b81c636bce" path="/var/lib/kubelet/pods/891aeac4-0a74-4b81-af81-f2b81c636bce/volumes" Apr 22 15:31:33.805707 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:33.805693 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef692d06-d8e4-413a-b0c6-82975ae615e1" path="/var/lib/kubelet/pods/ef692d06-d8e4-413a-b0c6-82975ae615e1/volumes" Apr 22 15:31:34.991793 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:34.991748 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" event={"ID":"838dbe94-9e5f-4e14-90bb-acff40dc38f7","Type":"ContainerStarted","Data":"58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a"} Apr 22 15:31:34.992293 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:34.991800 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" event={"ID":"838dbe94-9e5f-4e14-90bb-acff40dc38f7","Type":"ContainerStarted","Data":"a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f"} Apr 22 15:31:35.007966 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:35.007914 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" podStartSLOduration=1.245898301 podStartE2EDuration="7.007900073s" podCreationTimestamp="2026-04-22 15:31:28 +0000 UTC" firstStartedPulling="2026-04-22 15:31:28.611692247 +0000 UTC m=+1371.398329285" lastFinishedPulling="2026-04-22 15:31:34.373693995 +0000 UTC m=+1377.160331057" observedRunningTime="2026-04-22 15:31:35.007185235 +0000 UTC m=+1377.793822306" watchObservedRunningTime="2026-04-22 15:31:35.007900073 +0000 UTC m=+1377.794537160" Apr 22 15:31:54.064811 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:54.064776 2577 generic.go:358] "Generic (PLEG): container finished" podID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerID="a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f" exitCode=0 Apr 22 15:31:54.065330 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:54.064853 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" event={"ID":"838dbe94-9e5f-4e14-90bb-acff40dc38f7","Type":"ContainerDied","Data":"a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f"} Apr 22 15:31:54.065330 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:54.065316 2577 scope.go:117] "RemoveContainer" 
containerID="a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f" Apr 22 15:31:54.953131 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:54.953105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sg7ld_must-gather-hjlb4_838dbe94-9e5f-4e14-90bb-acff40dc38f7/gather/0.log" Apr 22 15:31:58.083250 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:58.083215 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-t86n6_8348847f-4e3a-43f6-bbab-5b6d67eff9fd/global-pull-secret-syncer/0.log" Apr 22 15:31:58.151781 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:58.151756 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-p49zs_7c6e1747-9197-468a-b61e-0e687eab6eaa/konnectivity-agent/0.log" Apr 22 15:31:58.225339 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:31:58.225312 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-228.ec2.internal_4f00231807412dde1ca296d35e880b1b/haproxy/0.log" Apr 22 15:32:00.315295 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.315263 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sg7ld/must-gather-hjlb4"] Apr 22 15:32:00.315755 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.315475 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerName="copy" containerID="cri-o://58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a" gracePeriod=2 Apr 22 15:32:00.317667 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.317630 2577 status_manager.go:895] "Failed to get status for pod" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" err="pods \"must-gather-hjlb4\" is forbidden: User \"system:node:ip-10-0-137-228.ec2.internal\" cannot get 
resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sg7ld\": no relationship found between node 'ip-10-0-137-228.ec2.internal' and this object" Apr 22 15:32:00.319091 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.319066 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sg7ld/must-gather-hjlb4"] Apr 22 15:32:00.548713 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.548688 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sg7ld_must-gather-hjlb4_838dbe94-9e5f-4e14-90bb-acff40dc38f7/copy/0.log" Apr 22 15:32:00.549065 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.549049 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" Apr 22 15:32:00.552152 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.552125 2577 status_manager.go:895] "Failed to get status for pod" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" err="pods \"must-gather-hjlb4\" is forbidden: User \"system:node:ip-10-0-137-228.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sg7ld\": no relationship found between node 'ip-10-0-137-228.ec2.internal' and this object" Apr 22 15:32:00.666496 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.666428 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/838dbe94-9e5f-4e14-90bb-acff40dc38f7-must-gather-output\") pod \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\" (UID: \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\") " Apr 22 15:32:00.666496 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.666485 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29vrg\" (UniqueName: 
\"kubernetes.io/projected/838dbe94-9e5f-4e14-90bb-acff40dc38f7-kube-api-access-29vrg\") pod \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\" (UID: \"838dbe94-9e5f-4e14-90bb-acff40dc38f7\") " Apr 22 15:32:00.668545 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.668518 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/838dbe94-9e5f-4e14-90bb-acff40dc38f7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "838dbe94-9e5f-4e14-90bb-acff40dc38f7" (UID: "838dbe94-9e5f-4e14-90bb-acff40dc38f7"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:32:00.668636 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.668573 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/838dbe94-9e5f-4e14-90bb-acff40dc38f7-kube-api-access-29vrg" (OuterVolumeSpecName: "kube-api-access-29vrg") pod "838dbe94-9e5f-4e14-90bb-acff40dc38f7" (UID: "838dbe94-9e5f-4e14-90bb-acff40dc38f7"). InnerVolumeSpecName "kube-api-access-29vrg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:32:00.767108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.767072 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/838dbe94-9e5f-4e14-90bb-acff40dc38f7-must-gather-output\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:32:00.767108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:00.767107 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29vrg\" (UniqueName: \"kubernetes.io/projected/838dbe94-9e5f-4e14-90bb-acff40dc38f7-kube-api-access-29vrg\") on node \"ip-10-0-137-228.ec2.internal\" DevicePath \"\"" Apr 22 15:32:01.094020 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.093993 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sg7ld_must-gather-hjlb4_838dbe94-9e5f-4e14-90bb-acff40dc38f7/copy/0.log" Apr 22 15:32:01.094375 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.094351 2577 generic.go:358] "Generic (PLEG): container finished" podID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerID="58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a" exitCode=143 Apr 22 15:32:01.094485 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.094407 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" Apr 22 15:32:01.094485 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.094425 2577 scope.go:117] "RemoveContainer" containerID="58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a" Apr 22 15:32:01.096481 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.096451 2577 status_manager.go:895] "Failed to get status for pod" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" err="pods \"must-gather-hjlb4\" is forbidden: User \"system:node:ip-10-0-137-228.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sg7ld\": no relationship found between node 'ip-10-0-137-228.ec2.internal' and this object" Apr 22 15:32:01.103417 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.103385 2577 scope.go:117] "RemoveContainer" containerID="a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f" Apr 22 15:32:01.105829 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.105803 2577 status_manager.go:895] "Failed to get status for pod" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" pod="openshift-must-gather-sg7ld/must-gather-hjlb4" err="pods \"must-gather-hjlb4\" is forbidden: User \"system:node:ip-10-0-137-228.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-sg7ld\": no relationship found between node 'ip-10-0-137-228.ec2.internal' and this object" Apr 22 15:32:01.115701 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.115684 2577 scope.go:117] "RemoveContainer" containerID="58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a" Apr 22 15:32:01.115959 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:32:01.115942 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a\": container with ID 
starting with 58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a not found: ID does not exist" containerID="58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a" Apr 22 15:32:01.116004 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.115968 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a"} err="failed to get container status \"58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a\": rpc error: code = NotFound desc = could not find container \"58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a\": container with ID starting with 58215ebb1ae6a517ffd9f2396bf7f4af4fb55038f8f649d20d7a0c9800365e8a not found: ID does not exist" Apr 22 15:32:01.116004 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.115990 2577 scope.go:117] "RemoveContainer" containerID="a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f" Apr 22 15:32:01.116237 ip-10-0-137-228 kubenswrapper[2577]: E0422 15:32:01.116216 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f\": container with ID starting with a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f not found: ID does not exist" containerID="a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f" Apr 22 15:32:01.116290 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.116243 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f"} err="failed to get container status \"a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f\": rpc error: code = NotFound desc = could not find container \"a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f\": container with ID starting with 
a36df350ec25800f24d8655de7f0ef608006a286d60e708613c07466d512c87f not found: ID does not exist" Apr 22 15:32:01.393244 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.393148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8ef4aadb-7584-494f-b7ec-96fed3eaee8d/alertmanager/0.log" Apr 22 15:32:01.422243 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.422218 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8ef4aadb-7584-494f-b7ec-96fed3eaee8d/config-reloader/0.log" Apr 22 15:32:01.453092 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.453069 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8ef4aadb-7584-494f-b7ec-96fed3eaee8d/kube-rbac-proxy-web/0.log" Apr 22 15:32:01.476738 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.476719 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8ef4aadb-7584-494f-b7ec-96fed3eaee8d/kube-rbac-proxy/0.log" Apr 22 15:32:01.504014 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.503990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8ef4aadb-7584-494f-b7ec-96fed3eaee8d/kube-rbac-proxy-metric/0.log" Apr 22 15:32:01.531300 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.531275 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8ef4aadb-7584-494f-b7ec-96fed3eaee8d/prom-label-proxy/0.log" Apr 22 15:32:01.556736 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.556718 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8ef4aadb-7584-494f-b7ec-96fed3eaee8d/init-config-reloader/0.log" Apr 22 15:32:01.594884 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.594864 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-6p57k_9070e43c-98ec-4211-8d57-0154d4934914/cluster-monitoring-operator/0.log" Apr 22 15:32:01.727593 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.727521 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-r9cgr_b90d9259-ac80-4f8e-a1d4-34cc591dc37f/monitoring-plugin/0.log" Apr 22 15:32:01.809711 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.809663 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" path="/var/lib/kubelet/pods/838dbe94-9e5f-4e14-90bb-acff40dc38f7/volumes" Apr 22 15:32:01.862734 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.862712 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-75wkc_535d8fcd-432a-4fa0-a4d0-4c0c54323776/node-exporter/0.log" Apr 22 15:32:01.888746 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.888718 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-75wkc_535d8fcd-432a-4fa0-a4d0-4c0c54323776/kube-rbac-proxy/0.log" Apr 22 15:32:01.912697 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:01.912672 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-75wkc_535d8fcd-432a-4fa0-a4d0-4c0c54323776/init-textfile/0.log" Apr 22 15:32:02.023070 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.023046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-l7k5k_572626ce-7b81-451b-b464-a73e55d35d02/kube-rbac-proxy-main/0.log" Apr 22 15:32:02.046179 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.046155 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-l7k5k_572626ce-7b81-451b-b464-a73e55d35d02/kube-rbac-proxy-self/0.log" Apr 22 15:32:02.072475 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.072454 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-l7k5k_572626ce-7b81-451b-b464-a73e55d35d02/openshift-state-metrics/0.log" Apr 22 15:32:02.107958 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.107935 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa60e528-1e5f-41c3-a1bf-51d4e6f07f88/prometheus/0.log" Apr 22 15:32:02.134446 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.134426 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa60e528-1e5f-41c3-a1bf-51d4e6f07f88/config-reloader/0.log" Apr 22 15:32:02.162347 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.162322 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa60e528-1e5f-41c3-a1bf-51d4e6f07f88/thanos-sidecar/0.log" Apr 22 15:32:02.222809 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.222787 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa60e528-1e5f-41c3-a1bf-51d4e6f07f88/kube-rbac-proxy-web/0.log" Apr 22 15:32:02.267414 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.267386 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa60e528-1e5f-41c3-a1bf-51d4e6f07f88/kube-rbac-proxy/0.log" Apr 22 15:32:02.331477 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.331403 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa60e528-1e5f-41c3-a1bf-51d4e6f07f88/kube-rbac-proxy-thanos/0.log" Apr 22 15:32:02.356552 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.356528 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fa60e528-1e5f-41c3-a1bf-51d4e6f07f88/init-config-reloader/0.log" Apr 22 15:32:02.386107 
ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.386074 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-qmc6p_3b31439e-0cf1-40d4-844b-59c2435526a4/prometheus-operator/0.log" Apr 22 15:32:02.406790 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.406769 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-qmc6p_3b31439e-0cf1-40d4-844b-59c2435526a4/kube-rbac-proxy/0.log" Apr 22 15:32:02.481927 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.481896 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f5f55ddc7-66h44_e93874f8-9ce0-4a04-8e9c-8ae407239e13/telemeter-client/0.log" Apr 22 15:32:02.512515 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.512497 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f5f55ddc7-66h44_e93874f8-9ce0-4a04-8e9c-8ae407239e13/reload/0.log" Apr 22 15:32:02.545228 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:02.545206 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5f5f55ddc7-66h44_e93874f8-9ce0-4a04-8e9c-8ae407239e13/kube-rbac-proxy/0.log" Apr 22 15:32:03.691789 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:03.691722 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-7htkt_a4fb852c-ad08-434d-abea-ab07a5423921/networking-console-plugin/0.log" Apr 22 15:32:04.120510 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.120475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/1.log" Apr 22 15:32:04.133147 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.133121 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-jztj7_8fe0a454-c595-4c12-b2ff-afc448fddec1/console-operator/2.log"
Apr 22 15:32:04.518801 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.518774 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9bc45bfd4-mxvcv_2f7fdc91-ebf1-4129-be66-e40b976ace74/console/0.log"
Apr 22 15:32:04.565565 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.565532 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7knnb_7e2980bd-3504-4803-9825-ab03e37698f6/download-server/0.log"
Apr 22 15:32:04.719741 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.719710 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"]
Apr 22 15:32:04.720108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.720040 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerName="gather"
Apr 22 15:32:04.720108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.720051 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerName="gather"
Apr 22 15:32:04.720108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.720081 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerName="copy"
Apr 22 15:32:04.720108 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.720090 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerName="copy"
Apr 22 15:32:04.720305 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.720146 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerName="copy"
Apr 22 15:32:04.720305 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.720160 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="838dbe94-9e5f-4e14-90bb-acff40dc38f7" containerName="gather"
Apr 22 15:32:04.723264 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.723245 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.725348 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.725329 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nrz24\"/\"kube-root-ca.crt\""
Apr 22 15:32:04.726347 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.726326 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nrz24\"/\"openshift-service-ca.crt\""
Apr 22 15:32:04.726460 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.726351 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nrz24\"/\"default-dockercfg-klhx2\""
Apr 22 15:32:04.730106 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.730070 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"]
Apr 22 15:32:04.800706 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.800639 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rzrt\" (UniqueName: \"kubernetes.io/projected/5b617489-e1e5-478f-9595-dc9a97dcbe4f-kube-api-access-5rzrt\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.800706 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.800671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-podres\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.800706 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.800701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-sys\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.800882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.800723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-lib-modules\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.800882 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.800781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-proc\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901424 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rzrt\" (UniqueName: \"kubernetes.io/projected/5b617489-e1e5-478f-9595-dc9a97dcbe4f-kube-api-access-5rzrt\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901424 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-podres\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-sys\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-lib-modules\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-proc\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901530 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-sys\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-podres\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901616 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-proc\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.901919 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.901668 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b617489-e1e5-478f-9595-dc9a97dcbe4f-lib-modules\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.909089 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.909066 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rzrt\" (UniqueName: \"kubernetes.io/projected/5b617489-e1e5-478f-9595-dc9a97dcbe4f-kube-api-access-5rzrt\") pod \"perf-node-gather-daemonset-rbj5l\" (UID: \"5b617489-e1e5-478f-9595-dc9a97dcbe4f\") " pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:04.974852 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:04.974824 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-m7bf5_af68fd91-9826-405c-b2a7-d2ea31c49737/volume-data-source-validator/0.log"
Apr 22 15:32:05.034406 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:05.034381 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:05.151414 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:05.151344 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"]
Apr 22 15:32:05.153756 ip-10-0-137-228 kubenswrapper[2577]: W0422 15:32:05.153724 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5b617489_e1e5_478f_9595_dc9a97dcbe4f.slice/crio-944467bc35d916a1cbe3b3eee105aff0663cb089de3a93a67bfdd954f84a74bb WatchSource:0}: Error finding container 944467bc35d916a1cbe3b3eee105aff0663cb089de3a93a67bfdd954f84a74bb: Status 404 returned error can't find the container with id 944467bc35d916a1cbe3b3eee105aff0663cb089de3a93a67bfdd954f84a74bb
Apr 22 15:32:05.626885 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:05.626850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9thxk_99b60141-c5a1-4685-b0c9-f59380bb89b8/dns/0.log"
Apr 22 15:32:05.647917 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:05.647888 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9thxk_99b60141-c5a1-4685-b0c9-f59380bb89b8/kube-rbac-proxy/0.log"
Apr 22 15:32:05.764657 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:05.764627 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4sxvb_031a7138-6b28-4cf1-9f28-ca9c3f9e3225/dns-node-resolver/0.log"
Apr 22 15:32:06.113836 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:06.113805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l" event={"ID":"5b617489-e1e5-478f-9595-dc9a97dcbe4f","Type":"ContainerStarted","Data":"fa4bc223f6cd29812ac48e65155adf9e9ec475118f931a2bb07d8f9c22a19dc5"}
Apr 22 15:32:06.113836 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:06.113837 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l" event={"ID":"5b617489-e1e5-478f-9595-dc9a97dcbe4f","Type":"ContainerStarted","Data":"944467bc35d916a1cbe3b3eee105aff0663cb089de3a93a67bfdd954f84a74bb"}
Apr 22 15:32:06.114052 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:06.113933 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:06.131709 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:06.131668 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l" podStartSLOduration=2.131653605 podStartE2EDuration="2.131653605s" podCreationTimestamp="2026-04-22 15:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:32:06.129562027 +0000 UTC m=+1408.916199130" watchObservedRunningTime="2026-04-22 15:32:06.131653605 +0000 UTC m=+1408.918290668"
Apr 22 15:32:06.289672 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:06.289621 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vrpzt_d075efdc-d5f5-490a-a543-09e52a1f9e38/node-ca/0.log"
Apr 22 15:32:07.370689 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:07.370656 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-nlf5r_c6c9ff67-fc53-4fad-bac9-aa152e2c0640/serve-healthcheck-canary/0.log"
Apr 22 15:32:07.700217 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:07.700114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-w75nz_ca3013d5-791d-4be7-9302-a69cf49ab049/insights-operator/0.log"
Apr 22 15:32:07.700704 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:07.700676 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-w75nz_ca3013d5-791d-4be7-9302-a69cf49ab049/insights-operator/1.log"
Apr 22 15:32:07.720927 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:07.720899 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2l9ld_672858ed-71c0-4480-881d-f921a89639d3/kube-rbac-proxy/0.log"
Apr 22 15:32:07.742658 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:07.742635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2l9ld_672858ed-71c0-4480-881d-f921a89639d3/exporter/0.log"
Apr 22 15:32:07.764151 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:07.764133 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2l9ld_672858ed-71c0-4480-881d-f921a89639d3/extractor/0.log"
Apr 22 15:32:12.126778 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:12.126750 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nrz24/perf-node-gather-daemonset-rbj5l"
Apr 22 15:32:12.910305 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:12.910278 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gjmkn_09812955-b6a7-49e2-9f95-13ab00645d14/kube-storage-version-migrator-operator/1.log"
Apr 22 15:32:12.911187 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:12.911166 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gjmkn_09812955-b6a7-49e2-9f95-13ab00645d14/kube-storage-version-migrator-operator/0.log"
Apr 22 15:32:13.933482 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:13.933452 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-94cqk_4bd5d6e7-5c42-493b-9b0c-b9dd5bf3d177/kube-multus/0.log"
Apr 22 15:32:14.123732 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.123657 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sg7kx_daf221a4-075f-4ecb-83fb-afb1b4d25997/kube-multus-additional-cni-plugins/0.log"
Apr 22 15:32:14.147097 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.147074 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sg7kx_daf221a4-075f-4ecb-83fb-afb1b4d25997/egress-router-binary-copy/0.log"
Apr 22 15:32:14.169120 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.169096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sg7kx_daf221a4-075f-4ecb-83fb-afb1b4d25997/cni-plugins/0.log"
Apr 22 15:32:14.190314 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.190292 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sg7kx_daf221a4-075f-4ecb-83fb-afb1b4d25997/bond-cni-plugin/0.log"
Apr 22 15:32:14.215243 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.215214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sg7kx_daf221a4-075f-4ecb-83fb-afb1b4d25997/routeoverride-cni/0.log"
Apr 22 15:32:14.240385 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.240365 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sg7kx_daf221a4-075f-4ecb-83fb-afb1b4d25997/whereabouts-cni-bincopy/0.log"
Apr 22 15:32:14.266747 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.266730 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sg7kx_daf221a4-075f-4ecb-83fb-afb1b4d25997/whereabouts-cni/0.log"
Apr 22 15:32:14.550588 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.550559 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9nk69_99c788ee-8bf0-4eb7-9e35-f464df2ca01e/network-metrics-daemon/0.log"
Apr 22 15:32:14.571060 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:14.571037 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-9nk69_99c788ee-8bf0-4eb7-9e35-f464df2ca01e/kube-rbac-proxy/0.log"
Apr 22 15:32:15.409259 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.409231 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-controller/0.log"
Apr 22 15:32:15.430079 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.430059 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/0.log"
Apr 22 15:32:15.437716 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.437694 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovn-acl-logging/1.log"
Apr 22 15:32:15.457351 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.457331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/kube-rbac-proxy-node/0.log"
Apr 22 15:32:15.481897 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.481879 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 15:32:15.502123 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.502105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/northd/0.log"
Apr 22 15:32:15.525725 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.525700 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/nbdb/0.log"
Apr 22 15:32:15.546809 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.546788 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/sbdb/0.log"
Apr 22 15:32:15.647008 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:15.646931 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-42xf8_82f59d25-cb9b-4bfc-a131-c631b53ef9c3/ovnkube-controller/0.log"
Apr 22 15:32:17.120358 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:17.120327 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-g7h5c_51174689-eee9-47c4-95c1-890adba68f5a/check-endpoints/0.log"
Apr 22 15:32:17.146175 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:17.146154 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bvrrk_c7876708-f581-4c0c-becb-c7c90e442cda/network-check-target-container/0.log"
Apr 22 15:32:18.083787 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:18.083748 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-p49tx_5fcf9aef-8476-4cab-aa68-0f61db3e03f3/iptables-alerter/0.log"
Apr 22 15:32:18.796540 ip-10-0-137-228 kubenswrapper[2577]: I0422 15:32:18.796516 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-s5jjg_ab887213-98f0-4051-a99d-d23453b1ec24/tuned/0.log"