Apr 20 10:01:25.872564 ip-10-0-138-148 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 10:01:26.284172 ip-10-0-138-148 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 10:01:26.284172 ip-10-0-138-148 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 10:01:26.284172 ip-10-0-138-148 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 10:01:26.284172 ip-10-0-138-148 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 10:01:26.284172 ip-10-0-138-148 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
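The deprecation warnings above all point to the kubelet config file named by --config. As a minimal sketch only, assuming the upstream KubeletConfiguration v1beta1 field names apply here, the deprecated flags would map to config-file fields roughly as follows; all values below are illustrative placeholders, not taken from this node:

```yaml
# Hypothetical KubeletConfiguration fragment (sketch, not this node's config).
# Shows where the flags flagged as deprecated in the log would live.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (placeholder endpoint)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (placeholder path)
volumePluginDir: /usr/libexec/kubernetes/kubelet-plugins/volume/exec/
# replaces --system-reserved (placeholder reservation)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```

Per the warning text, the replacement for --minimum-container-ttl-duration is eviction configuration (evictionHard/evictionSoft), not a direct field rename.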
Apr 20 10:01:26.285238 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.285152 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 10:01:26.293589 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293570 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 10:01:26.293589 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293584 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 10:01:26.293589 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293588 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 10:01:26.293589 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293592 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293596 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293600 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293603 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293606 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293609 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293612 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293615 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293618 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293622 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293626 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293629 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293632 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293635 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293638 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293640 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293643 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293645 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293648 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293650 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:26.293735 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293653 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293656 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293659 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293661 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293664 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293666 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293669 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293671 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293674 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293677 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293680 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293682 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293685 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293688 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293692 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293695 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293698 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293700 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293703 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293706 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:26.294204 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293709 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293711 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293714 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293716 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293719 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293722 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293724 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293727 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293729 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293732 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293735 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293738 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293740 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293743 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293746 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293749 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293751 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293754 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293757 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 10:01:26.294749 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293761 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293765 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293768 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293771 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293776 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293781 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293785 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293789 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293792 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293795 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293798 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293801 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293803 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293806 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293808 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293811 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293814 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293817 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293820 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293822 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 10:01:26.295241 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293825 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293827 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293829 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.293832 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294184 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294189 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294191 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294194 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294197 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294199 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294202 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294204 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294207 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294210 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294213 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294216 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294219 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294221 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294224 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 10:01:26.295760 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294227 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294230 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294233 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294236 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294238 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294241 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294243 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294246 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294248 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294251 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294254 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294256 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294259 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294261 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294265 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294268 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294271 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294274 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294276 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 10:01:26.296231 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294279 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294281 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294284 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294286 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294289 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294292 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294294 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294297 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294299 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294319 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294322 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294324 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294327 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294330 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294333 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294336 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294339 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294341 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294344 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294346 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 10:01:26.296719 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294349 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294352 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294354 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294357 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294360 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294362 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294365 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294368 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294370 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294373 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294376 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294379 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294381 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294384 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294386 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294389 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294392 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294395 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294397 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 10:01:26.297212 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294400 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294404 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294407 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294411 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294414 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294417 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294421 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294424 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294427 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294430 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294432 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294435 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.294437 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.295982 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.295991 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.295999 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296003 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296008 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296012 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296016 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296021 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 10:01:26.297696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296025 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296028 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296031 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296035 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296038 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296041 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296044 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296047 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296049 2577 flags.go:64] FLAG: --cloud-config=""
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296052 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296055 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296059 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296062 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296065 2577 flags.go:64] FLAG: --config-dir=""
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296068 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296072 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296076 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296079 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296082 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296085 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296088 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296091 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296094 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296097 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296100 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 10:01:26.298215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296105 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296108 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296111 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296114 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296117 2577 flags.go:64] FLAG: --enable-server="true"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296120 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296125 2577 flags.go:64] FLAG: --event-burst="100"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296128 2577 flags.go:64] FLAG: --event-qps="50"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296130 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296133 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296136 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296140 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296143 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296146 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296149 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296152 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296155 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296159 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296162 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296165 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296168 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296171 2577 flags.go:64] FLAG: --feature-gates=""
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296175 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296177 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296181 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 10:01:26.298829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296184 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 10:01:26.299452
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296187 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296190 2577 flags.go:64] FLAG: --help="false" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296192 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296196 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296198 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296201 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296204 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296208 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296211 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296213 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296216 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296219 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296222 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296225 2577 
flags.go:64] FLAG: --kube-api-qps="50" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296229 2577 flags.go:64] FLAG: --kube-reserved="" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296232 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296235 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296238 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296241 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296244 2577 flags.go:64] FLAG: --lock-file="" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296247 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296250 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296253 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 10:01:26.299452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296258 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296261 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296264 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296267 2577 flags.go:64] FLAG: --logging-format="text" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296270 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 10:01:26.300053 ip-10-0-138-148 
kubenswrapper[2577]: I0420 10:01:26.296274 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296277 2577 flags.go:64] FLAG: --manifest-url="" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296279 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296284 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296287 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296291 2577 flags.go:64] FLAG: --max-pods="110" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296294 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296297 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296299 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296316 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296320 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296323 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296326 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296336 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296339 2577 
flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296342 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296346 2577 flags.go:64] FLAG: --pod-cidr="" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296349 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 10:01:26.300053 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296355 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296358 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296361 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296364 2577 flags.go:64] FLAG: --port="10250" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296367 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296370 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0312fa60526fc5f98" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296373 2577 flags.go:64] FLAG: --qos-reserved="" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296376 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296379 2577 flags.go:64] FLAG: --register-node="true" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296383 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296386 2577 flags.go:64] FLAG: 
--register-with-taints="" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296389 2577 flags.go:64] FLAG: --registry-burst="10" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296392 2577 flags.go:64] FLAG: --registry-qps="5" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296395 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296398 2577 flags.go:64] FLAG: --reserved-memory="" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296402 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296405 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296408 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296411 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296414 2577 flags.go:64] FLAG: --runonce="false" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296417 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296420 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296423 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296425 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296428 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296431 
2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 10:01:26.300633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296434 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296437 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296440 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296443 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296446 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296449 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296453 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296456 2577 flags.go:64] FLAG: --system-cgroups="" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296458 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296464 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296466 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296469 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296473 2577 flags.go:64] FLAG: --tls-min-version="" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296476 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 10:01:26.301287 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296479 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296482 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296485 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296488 2577 flags.go:64] FLAG: --v="2" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296492 2577 flags.go:64] FLAG: --version="false" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296496 2577 flags.go:64] FLAG: --vmodule="" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296500 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.296503 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296590 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296594 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 10:01:26.301287 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296598 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296601 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296604 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296607 2577 feature_gate.go:328] unrecognized feature gate: 
AlibabaPlatform Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296610 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296612 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296615 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296618 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296620 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296622 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296625 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296628 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296630 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296633 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296636 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296639 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 
10:01:26.296642 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296644 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296647 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296649 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 10:01:26.301907 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296652 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296654 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296661 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296664 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296666 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296669 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296671 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296674 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296677 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 
10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296679 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296682 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296684 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296686 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296689 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296692 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296695 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296697 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296700 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296704 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 10:01:26.302449 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296706 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296709 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 
10:01:26.296712 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296714 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296717 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296719 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296722 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296726 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296731 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296734 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296738 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296742 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296745 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296747 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296750 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296754 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296756 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296759 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296761 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296764 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 10:01:26.302924 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296766 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296769 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296771 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296774 2577 feature_gate.go:328] 
unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296777 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296779 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296782 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296784 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296786 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296789 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296792 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296796 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296798 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296801 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296804 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296806 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 10:01:26.303542 ip-10-0-138-148 
kubenswrapper[2577]: W0420 10:01:26.296809 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296812 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296814 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:26.303542 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296816 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:26.304011 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296819 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:26.304011 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296824 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:26.304011 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296827 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:26.304011 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296829 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:26.304011 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.296832 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:26.304011 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.297589 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 10:01:26.304533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.304515 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 10:01:26.304571 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.304534 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 10:01:26.304603 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304596 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 10:01:26.304603 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304601 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304605 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304608 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304611 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304614 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304617 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304620 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304622 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304625 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304628 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304630 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304633 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304635 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304638 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304641 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304643 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304646 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304648 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304651 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304654 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 10:01:26.304658 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304656 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304659 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304662 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304665 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304668 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304671 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304674 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304676 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304679 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304681 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304684 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304686 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304689 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304691 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304694 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304696 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304698 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304701 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304703 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304706 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 10:01:26.305152 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304708 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304711 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304713 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304716 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304718 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304720 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304723 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304725 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304728 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304730 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304733 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304735 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304739 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304744 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304748 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304751 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304753 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304757 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304759 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:26.305664 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304762 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304764 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304767 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304770 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304772 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304775 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304777 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304780 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304782 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304785 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304788 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304791 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304795 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304797 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304799 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304802 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304805 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304808 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304810 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 10:01:26.306124 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304813 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304815 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304818 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304821 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304823 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304826 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304828 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.304833 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304922 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304927 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304930 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304933 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304936 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304940 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304943 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 10:01:26.306616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304945 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304948 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304950 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304953 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304955 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304958 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304960 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304963 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304965 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304968 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304970 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304973 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304975 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304977 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304980 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304984 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304987 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304990 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304992 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 10:01:26.306989 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304995 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.304997 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305000 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305002 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305005 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305007 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305010 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305014 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305018 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305021 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305023 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305027 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305029 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305032 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305035 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305037 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305040 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305042 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305045 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 10:01:26.307536 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305047 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305050 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305052 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305055 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305057 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305059 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305062 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305064 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305067 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305069 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305072 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305075 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305078 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305080 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305082 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305085 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305087 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305090 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305092 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305095 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 10:01:26.308004 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305098 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305100 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305103 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305105 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305107 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305110 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305113 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305115 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305118 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305120 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305122 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305125 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305127 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305130 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305132 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305135 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305137 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305140 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305142 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305144 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 10:01:26.308519 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:26.305147 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 10:01:26.308999 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.305151 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 10:01:26.308999 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.305242 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 10:01:26.308999 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.307653 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 10:01:26.308999 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.308662 2577 server.go:1019] "Starting client certificate rotation"
Apr 20 10:01:26.308999 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.308756 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 10:01:26.308999 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.308791 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 10:01:26.338527 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.338510 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 10:01:26.341066 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.341048 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 10:01:26.363100 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.363081 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 20 10:01:26.366760 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.366742 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 10:01:26.370026 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.370012 2577 log.go:25] "Validated CRI v1 image API"
Apr 20 10:01:26.371370 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.371355 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 10:01:26.373861 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.373843 2577 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 85bd226d-09ce-4860-95d1-721764203640:/dev/nvme0n1p4 d8fd2c84-f90a-4ec6-baa4-361e705e2fe2:/dev/nvme0n1p3]
Apr 20 10:01:26.373904 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.373861 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 10:01:26.379623 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.379514 2577 manager.go:217] Machine: {Timestamp:2026-04-20 10:01:26.377351177 +0000 UTC m=+0.398197392 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099252 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24831154b0cd6d61abb8541031c06b SystemUUID:ec248311-54b0-cd6d-61ab-b8541031c06b BootID:6ecd9219-752f-4193-bb9d-6a6b2f19c0d2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:23:1d:64:21:f7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:23:1d:64:21:f7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:82:38:a6:78:be:fa Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 10:01:26.379623 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.379619 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 10:01:26.379746 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.379711 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 10:01:26.382433 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.382409 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 10:01:26.382567 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.382436 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-148.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 10:01:26.382615 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.382577 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 10:01:26.382615 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.382584 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 10:01:26.382615 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.382597 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 10:01:26.383416 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.383405 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 10:01:26.385268 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.385258 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 10:01:26.385377 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.385368 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 10:01:26.388128 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.388118 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 10:01:26.388180 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.388136 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 10:01:26.388180 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.388147 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 10:01:26.388180 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.388156 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 20 10:01:26.388180 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.388164 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 10:01:26.389275 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.389263 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 10:01:26.389326 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.389282 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 10:01:26.392617 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.392601 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 10:01:26.394420 ip-10-0-138-148
kubenswrapper[2577]: I0420 10:01:26.394404 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 10:01:26.396603 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396588 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396610 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396619 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396627 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396635 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396643 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396651 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396659 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396668 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396676 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 10:01:26.396700 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396700 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 
10:01:26.396994 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.396714 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 10:01:26.397565 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.397549 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qt4cc" Apr 20 10:01:26.397763 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.397751 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 10:01:26.397817 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.397767 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 10:01:26.398593 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.398568 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 10:01:26.398904 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.398873 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-148.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 10:01:26.401300 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.401287 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 10:01:26.401360 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.401340 2577 server.go:1295] "Started kubelet" Apr 20 10:01:26.401457 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.401430 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 10:01:26.401538 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.401482 2577 ratelimit.go:55] 
"Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 10:01:26.401589 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.401539 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 10:01:26.401848 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.401831 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-148.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 10:01:26.402008 ip-10-0-138-148 systemd[1]: Started Kubernetes Kubelet. Apr 20 10:01:26.403003 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.402988 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 10:01:26.404512 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.404496 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 20 10:01:26.404625 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.404531 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qt4cc" Apr 20 10:01:26.411292 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.411262 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 10:01:26.413380 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.413365 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 10:01:26.413380 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.413376 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 10:01:26.414043 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414025 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 10:01:26.414043 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414034 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 10:01:26.414043 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414047 2577 factory.go:55] Registering systemd factory Apr 20 10:01:26.414216 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414055 2577 factory.go:223] Registration of the systemd container factory successfully Apr 20 10:01:26.414216 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414045 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 10:01:26.414216 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414025 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 10:01:26.414216 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414151 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 20 10:01:26.414216 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414160 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 20 10:01:26.414512 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414247 2577 factory.go:153] Registering CRI-O factory Apr 20 10:01:26.414512 ip-10-0-138-148 
kubenswrapper[2577]: I0420 10:01:26.414261 2577 factory.go:223] Registration of the crio container factory successfully Apr 20 10:01:26.414512 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414285 2577 factory.go:103] Registering Raw factory Apr 20 10:01:26.414512 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414297 2577 manager.go:1196] Started watching for new ooms in manager Apr 20 10:01:26.414512 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.414370 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found" Apr 20 10:01:26.414775 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.414754 2577 manager.go:319] Starting recovery of all containers Apr 20 10:01:26.416012 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.415960 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:26.418768 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.418745 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-148.ec2.internal\" not found" node="ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.424585 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.424568 2577 manager.go:324] Recovery completed Apr 20 10:01:26.429042 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.429027 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:26.431750 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.431735 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:26.431832 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.431766 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:26.431832 ip-10-0-138-148 
kubenswrapper[2577]: I0420 10:01:26.431781 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:26.432218 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.432206 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 10:01:26.432285 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.432219 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 10:01:26.432285 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.432256 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 10:01:26.436187 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.436173 2577 policy_none.go:49] "None policy: Start" Apr 20 10:01:26.436254 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.436191 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 10:01:26.436254 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.436204 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.471402 2577 manager.go:341] "Starting Device Plugin manager" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.471432 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.471442 2577 server.go:85] "Starting device plugin registration server" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.471690 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.471703 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.471805 2577 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.471870 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.471878 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.472385 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 10:01:26.489047 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.472435 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-148.ec2.internal\" not found" Apr 20 10:01:26.532357 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.532331 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 10:01:26.533460 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.533444 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 10:01:26.533520 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.533474 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 10:01:26.533520 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.533512 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 10:01:26.533604 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.533521 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 10:01:26.533604 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.533584 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 10:01:26.536167 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.536128 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:26.572087 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.572065 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:26.575471 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.575457 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:26.575541 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.575482 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:26.575541 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.575493 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:26.575541 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.575511 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.587497 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.587470 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.587497 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.587498 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-148.ec2.internal\": node \"ip-10-0-138-148.ec2.internal\" not found" Apr 20 
10:01:26.610039 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.610022 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found" Apr 20 10:01:26.633894 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.633862 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal"] Apr 20 10:01:26.633992 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.633928 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:26.634590 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.634576 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:26.634649 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.634599 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:26.634649 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.634609 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:26.636757 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.636745 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:26.636901 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.636885 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.636951 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.636915 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:26.637396 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.637380 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:26.637466 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.637403 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:26.637466 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.637426 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:26.637466 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.637436 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:26.637600 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.637407 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:26.637600 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.637505 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:26.639604 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.639588 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.639685 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.639617 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 10:01:26.640264 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.640249 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientMemory" Apr 20 10:01:26.640349 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.640274 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 10:01:26.640349 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.640284 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeHasSufficientPID" Apr 20 10:01:26.666874 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.666856 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-148.ec2.internal\" not found" node="ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.671084 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.671067 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-148.ec2.internal\" not found" node="ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.710389 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.710370 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found" Apr 20 10:01:26.715234 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.715218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25e88e8c83637a43efaa5aa79e7847b0-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal\" (UID: \"25e88e8c83637a43efaa5aa79e7847b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.715341 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.715248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25e88e8c83637a43efaa5aa79e7847b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal\" (UID: \"25e88e8c83637a43efaa5aa79e7847b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.715341 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.715280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3678d8120bbc28bfdc9a8f678b7b7df-config\") pod \"kube-apiserver-proxy-ip-10-0-138-148.ec2.internal\" (UID: \"e3678d8120bbc28bfdc9a8f678b7b7df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.811389 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.811336 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found" Apr 20 10:01:26.815630 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.815612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25e88e8c83637a43efaa5aa79e7847b0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal\" (UID: \"25e88e8c83637a43efaa5aa79e7847b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.815716 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.815645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/25e88e8c83637a43efaa5aa79e7847b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal\" (UID: \"25e88e8c83637a43efaa5aa79e7847b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.815716 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.815675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3678d8120bbc28bfdc9a8f678b7b7df-config\") pod \"kube-apiserver-proxy-ip-10-0-138-148.ec2.internal\" (UID: \"e3678d8120bbc28bfdc9a8f678b7b7df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.815802 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.815716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/25e88e8c83637a43efaa5aa79e7847b0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal\" (UID: \"25e88e8c83637a43efaa5aa79e7847b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.815802 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.815725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25e88e8c83637a43efaa5aa79e7847b0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal\" (UID: \"25e88e8c83637a43efaa5aa79e7847b0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" Apr 20 10:01:26.815802 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.815725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3678d8120bbc28bfdc9a8f678b7b7df-config\") pod \"kube-apiserver-proxy-ip-10-0-138-148.ec2.internal\" (UID: \"e3678d8120bbc28bfdc9a8f678b7b7df\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal"
Apr 20 10:01:26.911997 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:26.911972 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:26.968451 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.968431 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal"
Apr 20 10:01:26.973975 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:26.973956 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal"
Apr 20 10:01:27.012584 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.012564 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.113070 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.113016 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.213531 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.213499 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.308950 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.308924 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 10:01:27.309650 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.309038 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 10:01:27.309650 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.309085 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 10:01:27.314085 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.314063 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.407757 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.407577 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 09:56:26 +0000 UTC" deadline="2027-10-31 01:36:39.563281524 +0000 UTC"
Apr 20 10:01:27.407865 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.407760 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13407h35m12.155527284s"
Apr 20 10:01:27.413457 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.413439 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 10:01:27.414429 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.414413 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.423541 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.423524 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 10:01:27.448516 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.448496 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bstc7"
Apr 20 10:01:27.455486 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.455470 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bstc7"
Apr 20 10:01:27.514383 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:27.514358 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3678d8120bbc28bfdc9a8f678b7b7df.slice/crio-ab64346e8796629c703209e5cc4a9d55770b7cc541fc643aa2c1ed43335408b3 WatchSource:0}: Error finding container ab64346e8796629c703209e5cc4a9d55770b7cc541fc643aa2c1ed43335408b3: Status 404 returned error can't find the container with id ab64346e8796629c703209e5cc4a9d55770b7cc541fc643aa2c1ed43335408b3
Apr 20 10:01:27.514943 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.514923 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.515463 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:27.515443 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e88e8c83637a43efaa5aa79e7847b0.slice/crio-e3eaa0f386b996c07213689a30257c99611fc6352ff6d2eef9f7e1da1735ec7f WatchSource:0}: Error finding container e3eaa0f386b996c07213689a30257c99611fc6352ff6d2eef9f7e1da1735ec7f: Status 404 returned error can't find the container with id e3eaa0f386b996c07213689a30257c99611fc6352ff6d2eef9f7e1da1735ec7f
Apr 20 10:01:27.518853 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.518835 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 10:01:27.536331 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.536280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal" event={"ID":"e3678d8120bbc28bfdc9a8f678b7b7df","Type":"ContainerStarted","Data":"ab64346e8796629c703209e5cc4a9d55770b7cc541fc643aa2c1ed43335408b3"}
Apr 20 10:01:27.537132 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.537110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" event={"ID":"25e88e8c83637a43efaa5aa79e7847b0","Type":"ContainerStarted","Data":"e3eaa0f386b996c07213689a30257c99611fc6352ff6d2eef9f7e1da1735ec7f"}
Apr 20 10:01:27.615409 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.615390 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.715932 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.715871 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.734200 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.734176 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 10:01:27.816847 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:27.816824 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-148.ec2.internal\" not found"
Apr 20 10:01:27.823972 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.823954 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 10:01:27.913830 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.913806 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal"
Apr 20 10:01:27.925848 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.925823 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 10:01:27.926669 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.926648 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal"
Apr 20 10:01:27.939563 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:27.939544 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 10:01:28.338425 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.338396 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 10:01:28.389124 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.389101 2577 apiserver.go:52] "Watching apiserver"
Apr 20 10:01:28.395666 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.395645 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 10:01:28.397354 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.397329 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4htjt","openshift-ovn-kubernetes/ovnkube-node-zwdv7","kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal","openshift-cluster-node-tuning-operator/tuned-ldc77","openshift-image-registry/node-ca-4467h","openshift-network-diagnostics/network-check-target-b258f","openshift-network-operator/iptables-alerter-lcfcx","kube-system/konnectivity-agent-78fm5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2","openshift-dns/node-resolver-qlvb7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal","openshift-multus/multus-additional-cni-plugins-j7mgk","openshift-multus/multus-vq9gr"]
Apr 20 10:01:28.399788 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.399767 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.402483 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.402092 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:28.402483 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.402120 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.402483 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.402177 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 10:01:28.402483 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.402120 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:28.402483 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.402210 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-n6kl2\""
Apr 20 10:01:28.404400 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.404380 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 10:01:28.404628 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.404613 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 10:01:28.404983 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.404856 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 10:01:28.404983 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.404913 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 10:01:28.405173 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.405157 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-w9fd6\""
Apr 20 10:01:28.405239 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.405173 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 10:01:28.405413 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.405395 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 10:01:28.407333 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.407024 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.407333 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.407153 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4467h"
Apr 20 10:01:28.408833 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.408806 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 10:01:28.408930 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.408861 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 10:01:28.408930 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.408884 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-sf22m\""
Apr 20 10:01:28.408930 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.408888 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 10:01:28.409415 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.409250 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 10:01:28.409415 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.409339 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mbqcm\""
Apr 20 10:01:28.409415 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.409391 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:28.409614 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:28.409468 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:28.409670 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.409644 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 10:01:28.411888 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.411869 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:28.412011 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:28.411986 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:28.414151 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.414132 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-78fm5"
Apr 20 10:01:28.415986 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.415967 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 10:01:28.416065 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.415973 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cnnxf\""
Apr 20 10:01:28.416065 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.416006 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 10:01:28.416571 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.416551 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.418496 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.418469 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 10:01:28.418661 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.418641 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-n4zld\""
Apr 20 10:01:28.418732 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.418676 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 10:01:28.418786 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.418776 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 10:01:28.419079 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.419055 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qlvb7"
Apr 20 10:01:28.420561 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.420541 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 10:01:28.420840 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.420824 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mspqh\""
Apr 20 10:01:28.420997 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.420977 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 10:01:28.421597 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.421579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j7mgk"
Apr 20 10:01:28.422970 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.422952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-systemd\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.423057 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.422977 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovnkube-config\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.423057 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13714523-2ce0-41a5-92d0-6d74b6f94cba-iptables-alerter-script\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.423057 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-run\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.423178 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423053 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:28.423178 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423092 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-registration-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.423178 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-device-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.423178 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423141 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13714523-2ce0-41a5-92d0-6d74b6f94cba-host-slash\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.423178 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwkk\" (UniqueName: \"kubernetes.io/projected/13714523-2ce0-41a5-92d0-6d74b6f94cba-kube-api-access-ccwkk\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysconfig\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-systemd-units\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-node-log\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34621a55-49e0-4ddf-85fb-fe957bb51987-host\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423299 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftdn\" (UniqueName: \"kubernetes.io/projected/aa479de0-842b-41a8-952f-4382abbdf250-kube-api-access-kftdn\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423338 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysctl-conf\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-host\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423381 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.423435 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-log-socket\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovn-node-metrics-cert\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovnkube-script-lib\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34621a55-49e0-4ddf-85fb-fe957bb51987-serviceca\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h"
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-sys-fs\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423678 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423689 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423800 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423729 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423850 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zvgjl\""
Apr 20 10:01:28.423874 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-var-lib-kubelet\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423891 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423942 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-tuned\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4ce2de7-ed89-497b-b795-9aa124b05d1c-tmp\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423979 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.423989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-slash\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424012 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-run-netns\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-var-lib-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-etc-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-modprobe-d\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424169 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a9613cd5-debf-4732-aa68-673d25ca0a6a-agent-certs\") pod \"konnectivity-agent-78fm5\" (UID: \"a9613cd5-debf-4732-aa68-673d25ca0a6a\") " pod="kube-system/konnectivity-agent-78fm5"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424193 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-socket-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424217 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxsnw\" (UniqueName: \"kubernetes.io/projected/1f12cab1-38db-4199-ab17-ed83ce13c27d-kube-api-access-vxsnw\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-kubelet\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424274 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-ovn\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.424419 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-cni-bin\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-systemd\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-cni-netd\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-env-overrides\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424451 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6rl5\" (UniqueName: \"kubernetes.io/projected/55296336-b343-4c13-ad2f-c3ceff32fcfe-kube-api-access-n6rl5\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r84j7\" (UniqueName: \"kubernetes.io/projected/34621a55-49e0-4ddf-85fb-fe957bb51987-kube-api-access-r84j7\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h"
Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-kubernetes\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424543 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysctl-d\") pod \"tuned-ldc77\" (UID:
\"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-sys\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424569 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-lib-modules\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfzf7\" (UniqueName: \"kubernetes.io/projected/f4ce2de7-ed89-497b-b795-9aa124b05d1c-kube-api-access-bfzf7\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.425171 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.424630 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a9613cd5-debf-4732-aa68-673d25ca0a6a-konnectivity-ca\") pod \"konnectivity-agent-78fm5\" (UID: \"a9613cd5-debf-4732-aa68-673d25ca0a6a\") " pod="kube-system/konnectivity-agent-78fm5" Apr 20 10:01:28.425777 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.425757 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-cspcd\"" Apr 20 
10:01:28.425858 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.425803 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 10:01:28.456207 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.456177 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 09:56:27 +0000 UTC" deadline="2028-01-26 03:54:28.435969756 +0000 UTC" Apr 20 10:01:28.456207 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.456207 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15497h52m59.979766164s" Apr 20 10:01:28.515257 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.515234 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 10:01:28.524984 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.524966 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:28.525080 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525000 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-cni-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.525080 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525026 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-os-release\") 
pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.525080 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525050 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-netns\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.525235 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cnibin\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.525235 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:28.525110 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:28.525235 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525130 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.525235 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-cni-bin\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525235 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:28.525195 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs podName:aa479de0-842b-41a8-952f-4382abbdf250 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:29.025158201 +0000 UTC m=+3.046004406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs") pod "network-metrics-daemon-4htjt" (UID: "aa479de0-842b-41a8-952f-4382abbdf250") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:28.525235 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-cni-bin\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525235 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovnkube-config\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4ce2de7-ed89-497b-b795-9aa124b05d1c-tmp\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525283 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-slash\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525320 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-etc-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-k8s-cni-cncf-io\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525372 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-modprobe-d\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-socket-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" Apr 20 10:01:28.525587 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-etc-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525459 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-slash\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-env-overrides\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.525587 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: 
\"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525611 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-socket-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-hostroot\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525685 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-cni-netd\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6rl5\" (UniqueName: 
\"kubernetes.io/projected/55296336-b343-4c13-ad2f-c3ceff32fcfe-kube-api-access-n6rl5\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525730 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-modprobe-d\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/081211a7-2d42-49fc-b457-6cded43e3390-hosts-file\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525728 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-cni-netd\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525739 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-lib-modules\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovnkube-config\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfzf7\" (UniqueName: \"kubernetes.io/projected/f4ce2de7-ed89-497b-b795-9aa124b05d1c-kube-api-access-bfzf7\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525909 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-env-overrides\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-lib-modules\") pod 
\"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xs9n\" (UniqueName: \"kubernetes.io/projected/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-kube-api-access-6xs9n\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93bf46da-c530-40ee-bced-9d0772cc84b7-cni-binary-copy\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.526063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.525993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-cni-multus\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-registration-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-registration-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwkk\" (UniqueName: \"kubernetes.io/projected/13714523-2ce0-41a5-92d0-6d74b6f94cba-kube-api-access-ccwkk\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526266 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-node-log\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34621a55-49e0-4ddf-85fb-fe957bb51987-host\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34621a55-49e0-4ddf-85fb-fe957bb51987-host\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526358 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-node-log\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kftdn\" (UniqueName: \"kubernetes.io/projected/aa479de0-842b-41a8-952f-4382abbdf250-kube-api-access-kftdn\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526428 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cni-binary-copy\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526452 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-host\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526501 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovn-node-metrics-cert\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-host\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526613 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovnkube-script-lib\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.526812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526660 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-os-release\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.526812 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-socket-dir-parent\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-sys-fs\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-var-lib-kubelet\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-sys-fs\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-etc-kubernetes\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526935 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-tuned\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-run-netns\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-var-lib-kubelet\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-etc-selinux\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.526982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-var-lib-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527007 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527061 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-run-netns\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-var-lib-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-kubelet\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527156 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a9613cd5-debf-4732-aa68-673d25ca0a6a-agent-certs\") pod \"konnectivity-agent-78fm5\" (UID: \"a9613cd5-debf-4732-aa68-673d25ca0a6a\") " pod="kube-system/konnectivity-agent-78fm5"
Apr 20 10:01:28.527672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxsnw\" (UniqueName: \"kubernetes.io/projected/1f12cab1-38db-4199-ab17-ed83ce13c27d-kube-api-access-vxsnw\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovnkube-script-lib\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-kubelet\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-ovn\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527355 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-host-kubelet\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/081211a7-2d42-49fc-b457-6cded43e3390-tmp-dir\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-conf-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-daemon-config\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-multus-certs\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527470 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-ovn\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-systemd\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r84j7\" (UniqueName: \"kubernetes.io/projected/34621a55-49e0-4ddf-85fb-fe957bb51987-kube-api-access-r84j7\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-systemd\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527610 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznwj\" (UniqueName: \"kubernetes.io/projected/081211a7-2d42-49fc-b457-6cded43e3390-kube-api-access-vznwj\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527713 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-cnibin\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-kubernetes\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysctl-d\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-sys\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.528458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527912 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a9613cd5-debf-4732-aa68-673d25ca0a6a-konnectivity-ca\") pod \"konnectivity-agent-78fm5\" (UID: \"a9613cd5-debf-4732-aa68-673d25ca0a6a\") " pod="kube-system/konnectivity-agent-78fm5"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-systemd\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.527986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgp8v\" (UniqueName: \"kubernetes.io/projected/93bf46da-c530-40ee-bced-9d0772cc84b7-kube-api-access-lgp8v\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13714523-2ce0-41a5-92d0-6d74b6f94cba-iptables-alerter-script\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-run\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-device-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528140 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13714523-2ce0-41a5-92d0-6d74b6f94cba-host-slash\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-systemd-units\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528174 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-kubernetes\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-system-cni-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528215 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysconfig\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528238 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-system-cni-dir\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-cni-bin\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysctl-conf\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.529535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-log-socket\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528365 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-sys\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528379 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34621a55-49e0-4ddf-85fb-fe957bb51987-serviceca\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528489 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysctl-d\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-systemd\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-run\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13714523-2ce0-41a5-92d0-6d74b6f94cba-iptables-alerter-script\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528767 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysconfig\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528786 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1f12cab1-38db-4199-ab17-ed83ce13c27d-device-dir\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34621a55-49e0-4ddf-85fb-fe957bb51987-serviceca\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13714523-2ce0-41a5-92d0-6d74b6f94cba-host-slash\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528846 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-systemd-units\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528881 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-log-socket\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/55296336-b343-4c13-ad2f-c3ceff32fcfe-run-openvswitch\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-sysctl-conf\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.528982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a9613cd5-debf-4732-aa68-673d25ca0a6a-konnectivity-ca\") pod \"konnectivity-agent-78fm5\" (UID: \"a9613cd5-debf-4732-aa68-673d25ca0a6a\") " pod="kube-system/konnectivity-agent-78fm5"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.529710 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/55296336-b343-4c13-ad2f-c3ceff32fcfe-ovn-node-metrics-cert\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.530089 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4ce2de7-ed89-497b-b795-9aa124b05d1c-tmp\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.530533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.530328 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4ce2de7-ed89-497b-b795-9aa124b05d1c-etc-tuned\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.531502 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.531412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a9613cd5-debf-4732-aa68-673d25ca0a6a-agent-certs\") pod \"konnectivity-agent-78fm5\" (UID: \"a9613cd5-debf-4732-aa68-673d25ca0a6a\") " pod="kube-system/konnectivity-agent-78fm5"
Apr 20 10:01:28.534601 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:28.534234 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 10:01:28.534601 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:28.534255 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 10:01:28.534601 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:28.534270 2577 projected.go:194] Error preparing data for projected volume kube-api-access-56v2q for pod openshift-network-diagnostics/network-check-target-b258f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:28.534601 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:28.534342 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q podName:340db265-d04c-46d7-b5b0-6141dced7313 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:29.034325552 +0000 UTC m=+3.055171776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-56v2q" (UniqueName: "kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q") pod "network-check-target-b258f" (UID: "340db265-d04c-46d7-b5b0-6141dced7313") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:28.536698 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.536672 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwkk\" (UniqueName: \"kubernetes.io/projected/13714523-2ce0-41a5-92d0-6d74b6f94cba-kube-api-access-ccwkk\") pod \"iptables-alerter-lcfcx\" (UID: \"13714523-2ce0-41a5-92d0-6d74b6f94cba\") " pod="openshift-network-operator/iptables-alerter-lcfcx"
Apr 20 10:01:28.537779 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.537755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kftdn\" (UniqueName: \"kubernetes.io/projected/aa479de0-842b-41a8-952f-4382abbdf250-kube-api-access-kftdn\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:28.537974 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.537947 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r84j7\" (UniqueName: \"kubernetes.io/projected/34621a55-49e0-4ddf-85fb-fe957bb51987-kube-api-access-r84j7\") pod \"node-ca-4467h\" (UID: \"34621a55-49e0-4ddf-85fb-fe957bb51987\") " pod="openshift-image-registry/node-ca-4467h"
Apr 20 10:01:28.538052 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.537998 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6rl5\" (UniqueName: \"kubernetes.io/projected/55296336-b343-4c13-ad2f-c3ceff32fcfe-kube-api-access-n6rl5\") pod \"ovnkube-node-zwdv7\" (UID: \"55296336-b343-4c13-ad2f-c3ceff32fcfe\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7"
Apr 20 10:01:28.538052 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.538044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxsnw\" (UniqueName: \"kubernetes.io/projected/1f12cab1-38db-4199-ab17-ed83ce13c27d-kube-api-access-vxsnw\") pod \"aws-ebs-csi-driver-node-9ttl2\" (UID: \"1f12cab1-38db-4199-ab17-ed83ce13c27d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2"
Apr 20 10:01:28.539116 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.539087 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfzf7\" (UniqueName: \"kubernetes.io/projected/f4ce2de7-ed89-497b-b795-9aa124b05d1c-kube-api-access-bfzf7\") pod \"tuned-ldc77\" (UID: \"f4ce2de7-ed89-497b-b795-9aa124b05d1c\") " pod="openshift-cluster-node-tuning-operator/tuned-ldc77"
Apr 20 10:01:28.628981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.628907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-k8s-cni-cncf-io\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.628981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.628940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk"
Apr 20 10:01:28.628981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.628962 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-hostroot\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.628981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.628979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/081211a7-2d42-49fc-b457-6cded43e3390-hosts-file\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xs9n\" (UniqueName: \"kubernetes.io/projected/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-kube-api-access-6xs9n\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93bf46da-c530-40ee-bced-9d0772cc84b7-cni-binary-copy\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629030 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-k8s-cni-cncf-io\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629053 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-cni-multus\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-hostroot\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-cni-multus\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cni-binary-copy\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629122 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/081211a7-2d42-49fc-b457-6cded43e3390-hosts-file\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-os-release\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk"
Apr 20 10:01:28.629287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-socket-dir-parent\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-etc-kubernetes\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-kubelet\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/081211a7-2d42-49fc-b457-6cded43e3390-tmp-dir\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7"
Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629405 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-socket-dir-parent\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629423 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-etc-kubernetes\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629466 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-conf-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr"
Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629475 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-os-release\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk"
Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629505 2577 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-daemon-config\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629515 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-kubelet\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-multus-certs\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-conf-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vznwj\" (UniqueName: \"kubernetes.io/projected/081211a7-2d42-49fc-b457-6cded43e3390-kube-api-access-vznwj\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-cnibin\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629617 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cni-binary-copy\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629653 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/081211a7-2d42-49fc-b457-6cded43e3390-tmp-dir\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7" Apr 20 10:01:28.629765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629673 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lgp8v\" (UniqueName: \"kubernetes.io/projected/93bf46da-c530-40ee-bced-9d0772cc84b7-kube-api-access-lgp8v\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-system-cni-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-cnibin\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629721 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-system-cni-dir\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629736 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-cni-bin\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629744 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-multus-certs\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629775 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-system-cni-dir\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629782 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-var-lib-cni-bin\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-cni-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629809 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-os-release\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629835 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-netns\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cnibin\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-os-release\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629884 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629782 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-system-cni-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-cni-dir\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/93bf46da-c530-40ee-bced-9d0772cc84b7-host-run-netns\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.630583 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629943 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-cnibin\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.631476 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.629965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/93bf46da-c530-40ee-bced-9d0772cc84b7-multus-daemon-config\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.631476 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.630121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/93bf46da-c530-40ee-bced-9d0772cc84b7-cni-binary-copy\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.631476 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.630319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.639668 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.639642 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xs9n\" (UniqueName: \"kubernetes.io/projected/f8dc4a94-baed-4ba8-8d13-6d45c52751d3-kube-api-access-6xs9n\") pod \"multus-additional-cni-plugins-j7mgk\" (UID: \"f8dc4a94-baed-4ba8-8d13-6d45c52751d3\") " pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.639807 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.639790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznwj\" (UniqueName: \"kubernetes.io/projected/081211a7-2d42-49fc-b457-6cded43e3390-kube-api-access-vznwj\") pod \"node-resolver-qlvb7\" (UID: \"081211a7-2d42-49fc-b457-6cded43e3390\") " pod="openshift-dns/node-resolver-qlvb7" Apr 20 10:01:28.639989 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.639967 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgp8v\" (UniqueName: \"kubernetes.io/projected/93bf46da-c530-40ee-bced-9d0772cc84b7-kube-api-access-lgp8v\") pod \"multus-vq9gr\" (UID: \"93bf46da-c530-40ee-bced-9d0772cc84b7\") " pod="openshift-multus/multus-vq9gr" Apr 20 10:01:28.715788 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.715760 2577 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-lcfcx" Apr 20 10:01:28.723466 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.723448 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:28.732001 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.731979 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ldc77" Apr 20 10:01:28.736580 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.736564 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4467h" Apr 20 10:01:28.740030 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.740011 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 10:01:28.742846 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.742825 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-78fm5" Apr 20 10:01:28.750451 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.750432 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" Apr 20 10:01:28.757895 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.757875 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qlvb7" Apr 20 10:01:28.764387 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.764371 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" Apr 20 10:01:28.769034 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:28.769015 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vq9gr" Apr 20 10:01:29.032385 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.032356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:29.032563 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:29.032503 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:29.032630 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:29.032570 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs podName:aa479de0-842b-41a8-952f-4382abbdf250 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:30.032547333 +0000 UTC m=+4.053393535 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs") pod "network-metrics-daemon-4htjt" (UID: "aa479de0-842b-41a8-952f-4382abbdf250") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 10:01:29.133146 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.133122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:01:29.133261 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:29.133244 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 10:01:29.133334 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:29.133266 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 10:01:29.133334 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:29.133278 2577 projected.go:194] Error preparing data for projected volume kube-api-access-56v2q for pod openshift-network-diagnostics/network-check-target-b258f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:29.133413 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:29.133361 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q podName:340db265-d04c-46d7-b5b0-6141dced7313 nodeName:}" failed. 
No retries permitted until 2026-04-20 10:01:30.133343452 +0000 UTC m=+4.154189655 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-56v2q" (UniqueName: "kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q") pod "network-check-target-b258f" (UID: "340db265-d04c-46d7-b5b0-6141dced7313") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 10:01:29.139456 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.139433 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4ce2de7_ed89_497b_b795_9aa124b05d1c.slice/crio-e477233f2881a21f7f0c80b26978e0c0c8ee07697fbc64f9a0040ca274849d3a WatchSource:0}: Error finding container e477233f2881a21f7f0c80b26978e0c0c8ee07697fbc64f9a0040ca274849d3a: Status 404 returned error can't find the container with id e477233f2881a21f7f0c80b26978e0c0c8ee07697fbc64f9a0040ca274849d3a Apr 20 10:01:29.140532 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.140508 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93bf46da_c530_40ee_bced_9d0772cc84b7.slice/crio-a3a8b80f700200332e31c1dd955d319ec381d1e3f80daaf61a2bf7785af5c8f4 WatchSource:0}: Error finding container a3a8b80f700200332e31c1dd955d319ec381d1e3f80daaf61a2bf7785af5c8f4: Status 404 returned error can't find the container with id a3a8b80f700200332e31c1dd955d319ec381d1e3f80daaf61a2bf7785af5c8f4 Apr 20 10:01:29.143099 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.142997 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f12cab1_38db_4199_ab17_ed83ce13c27d.slice/crio-5f1cf4baf61c6eeaf48bb4f83b5d973487fa44b35ce458c3ae803fad3d536539 WatchSource:0}: Error finding container 
5f1cf4baf61c6eeaf48bb4f83b5d973487fa44b35ce458c3ae803fad3d536539: Status 404 returned error can't find the container with id 5f1cf4baf61c6eeaf48bb4f83b5d973487fa44b35ce458c3ae803fad3d536539 Apr 20 10:01:29.147724 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.147703 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13714523_2ce0_41a5_92d0_6d74b6f94cba.slice/crio-63362aaba6f5d92216f371b52616dd2311be74834d1af3baeecab107d9d5c210 WatchSource:0}: Error finding container 63362aaba6f5d92216f371b52616dd2311be74834d1af3baeecab107d9d5c210: Status 404 returned error can't find the container with id 63362aaba6f5d92216f371b52616dd2311be74834d1af3baeecab107d9d5c210 Apr 20 10:01:29.148823 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.148799 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8dc4a94_baed_4ba8_8d13_6d45c52751d3.slice/crio-1ddf62514438099a9042fbb1d0f2ef8a9b61a4733cbcccf9ac9308abc9216849 WatchSource:0}: Error finding container 1ddf62514438099a9042fbb1d0f2ef8a9b61a4733cbcccf9ac9308abc9216849: Status 404 returned error can't find the container with id 1ddf62514438099a9042fbb1d0f2ef8a9b61a4733cbcccf9ac9308abc9216849 Apr 20 10:01:29.149513 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.149424 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55296336_b343_4c13_ad2f_c3ceff32fcfe.slice/crio-c6fbd3e158c507efbf5858fa6dd930a965afbf4ea63c45e7a6fb8604a03e59cc WatchSource:0}: Error finding container c6fbd3e158c507efbf5858fa6dd930a965afbf4ea63c45e7a6fb8604a03e59cc: Status 404 returned error can't find the container with id c6fbd3e158c507efbf5858fa6dd930a965afbf4ea63c45e7a6fb8604a03e59cc Apr 20 10:01:29.151245 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.151224 2577 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod081211a7_2d42_49fc_b457_6cded43e3390.slice/crio-480e77d8ffa3d7ab382eb0a3e69f23884387601c7efe6448c0e03c39609526bb WatchSource:0}: Error finding container 480e77d8ffa3d7ab382eb0a3e69f23884387601c7efe6448c0e03c39609526bb: Status 404 returned error can't find the container with id 480e77d8ffa3d7ab382eb0a3e69f23884387601c7efe6448c0e03c39609526bb Apr 20 10:01:29.151657 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.151636 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9613cd5_debf_4732_aa68_673d25ca0a6a.slice/crio-55c34d3abc337963292149d2804aca25944c373b6bf3a50412d20174cb748fe6 WatchSource:0}: Error finding container 55c34d3abc337963292149d2804aca25944c373b6bf3a50412d20174cb748fe6: Status 404 returned error can't find the container with id 55c34d3abc337963292149d2804aca25944c373b6bf3a50412d20174cb748fe6 Apr 20 10:01:29.151960 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:01:29.151939 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34621a55_49e0_4ddf_85fb_fe957bb51987.slice/crio-2dc1ff47dc3fab1e1c14e5626f6debebd8a01cb159dbc0f71dc2c3567331dd25 WatchSource:0}: Error finding container 2dc1ff47dc3fab1e1c14e5626f6debebd8a01cb159dbc0f71dc2c3567331dd25: Status 404 returned error can't find the container with id 2dc1ff47dc3fab1e1c14e5626f6debebd8a01cb159dbc0f71dc2c3567331dd25 Apr 20 10:01:29.457226 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.457151 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 09:56:27 +0000 UTC" deadline="2027-10-21 08:03:59.166274526 +0000 UTC" Apr 20 10:01:29.457226 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.457188 2577 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="13174h2m29.709090455s"
Apr 20 10:01:29.534491 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.534465 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:29.534636 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:29.534605 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:29.550894 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.550161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal" event={"ID":"e3678d8120bbc28bfdc9a8f678b7b7df","Type":"ContainerStarted","Data":"8de3db1af2d2f2aad7692776dded558a693cb93648ea7989659e567ce9d4c712"}
Apr 20 10:01:29.558831 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.558798 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qlvb7" event={"ID":"081211a7-2d42-49fc-b457-6cded43e3390","Type":"ContainerStarted","Data":"480e77d8ffa3d7ab382eb0a3e69f23884387601c7efe6448c0e03c39609526bb"}
Apr 20 10:01:29.566549 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.566517 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lcfcx" event={"ID":"13714523-2ce0-41a5-92d0-6d74b6f94cba","Type":"ContainerStarted","Data":"63362aaba6f5d92216f371b52616dd2311be74834d1af3baeecab107d9d5c210"}
Apr 20 10:01:29.567505 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.567458 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-148.ec2.internal" podStartSLOduration=2.567443463 podStartE2EDuration="2.567443463s" podCreationTimestamp="2026-04-20 10:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:01:29.566589692 +0000 UTC m=+3.587435920" watchObservedRunningTime="2026-04-20 10:01:29.567443463 +0000 UTC m=+3.588289704"
Apr 20 10:01:29.567986 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.567962 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" event={"ID":"1f12cab1-38db-4199-ab17-ed83ce13c27d","Type":"ContainerStarted","Data":"5f1cf4baf61c6eeaf48bb4f83b5d973487fa44b35ce458c3ae803fad3d536539"}
Apr 20 10:01:29.571488 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.571465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ldc77" event={"ID":"f4ce2de7-ed89-497b-b795-9aa124b05d1c","Type":"ContainerStarted","Data":"e477233f2881a21f7f0c80b26978e0c0c8ee07697fbc64f9a0040ca274849d3a"}
Apr 20 10:01:29.573653 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.573609 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" event={"ID":"f8dc4a94-baed-4ba8-8d13-6d45c52751d3","Type":"ContainerStarted","Data":"1ddf62514438099a9042fbb1d0f2ef8a9b61a4733cbcccf9ac9308abc9216849"}
Apr 20 10:01:29.579348 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.579187 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4467h" event={"ID":"34621a55-49e0-4ddf-85fb-fe957bb51987","Type":"ContainerStarted","Data":"2dc1ff47dc3fab1e1c14e5626f6debebd8a01cb159dbc0f71dc2c3567331dd25"}
Apr 20 10:01:29.583706 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.583650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-78fm5" event={"ID":"a9613cd5-debf-4732-aa68-673d25ca0a6a","Type":"ContainerStarted","Data":"55c34d3abc337963292149d2804aca25944c373b6bf3a50412d20174cb748fe6"}
Apr 20 10:01:29.589798 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.589775 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"c6fbd3e158c507efbf5858fa6dd930a965afbf4ea63c45e7a6fb8604a03e59cc"}
Apr 20 10:01:29.598065 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:29.598011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vq9gr" event={"ID":"93bf46da-c530-40ee-bced-9d0772cc84b7","Type":"ContainerStarted","Data":"a3a8b80f700200332e31c1dd955d319ec381d1e3f80daaf61a2bf7785af5c8f4"}
Apr 20 10:01:30.040526 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:30.040491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:30.040702 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:30.040635 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:30.040765 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:30.040702 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs podName:aa479de0-842b-41a8-952f-4382abbdf250 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:32.040682282 +0000 UTC m=+6.061528498 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs") pod "network-metrics-daemon-4htjt" (UID: "aa479de0-842b-41a8-952f-4382abbdf250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:30.143423 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:30.142788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:30.143423 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:30.142982 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 10:01:30.143423 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:30.142999 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 10:01:30.143423 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:30.143012 2577 projected.go:194] Error preparing data for projected volume kube-api-access-56v2q for pod openshift-network-diagnostics/network-check-target-b258f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:30.143423 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:30.143064 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q podName:340db265-d04c-46d7-b5b0-6141dced7313 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:32.143047899 +0000 UTC m=+6.163894108 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-56v2q" (UniqueName: "kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q") pod "network-check-target-b258f" (UID: "340db265-d04c-46d7-b5b0-6141dced7313") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:30.536438 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:30.536407 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:30.536894 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:30.536539 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:30.624800 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:30.624727 2577 generic.go:358] "Generic (PLEG): container finished" podID="25e88e8c83637a43efaa5aa79e7847b0" containerID="467ea8e989da0a6bc9b4efd350a329540190b2a9e675ce187460f30070ec32b1" exitCode=0
Apr 20 10:01:30.625695 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:30.625668 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" event={"ID":"25e88e8c83637a43efaa5aa79e7847b0","Type":"ContainerDied","Data":"467ea8e989da0a6bc9b4efd350a329540190b2a9e675ce187460f30070ec32b1"}
Apr 20 10:01:31.534542 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:31.533834 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:31.534542 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:31.534166 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:31.630624 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:31.630587 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" event={"ID":"25e88e8c83637a43efaa5aa79e7847b0","Type":"ContainerStarted","Data":"dab614021faeae242f2c205c348fd88a4619fade7405b6d2e098540973754307"}
Apr 20 10:01:32.057616 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:32.057579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:32.057825 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:32.057729 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:32.057825 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:32.057795 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs podName:aa479de0-842b-41a8-952f-4382abbdf250 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:36.057774691 +0000 UTC m=+10.078620907 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs") pod "network-metrics-daemon-4htjt" (UID: "aa479de0-842b-41a8-952f-4382abbdf250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:32.158941 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:32.158903 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:32.159090 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:32.159043 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 10:01:32.159090 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:32.159066 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 10:01:32.159090 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:32.159078 2577 projected.go:194] Error preparing data for projected volume kube-api-access-56v2q for pod openshift-network-diagnostics/network-check-target-b258f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:32.159240 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:32.159136 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q podName:340db265-d04c-46d7-b5b0-6141dced7313 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:36.159117754 +0000 UTC m=+10.179963969 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-56v2q" (UniqueName: "kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q") pod "network-check-target-b258f" (UID: "340db265-d04c-46d7-b5b0-6141dced7313") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:32.534685 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:32.534658 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:32.534851 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:32.534824 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:33.534074 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:33.534038 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:33.534546 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:33.534165 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:34.534637 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:34.534512 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:34.535091 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:34.534646 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:35.534055 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:35.534023 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:35.534246 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:35.534145 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:36.088892 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:36.088858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:36.089438 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:36.089067 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:36.089438 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:36.089132 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs podName:aa479de0-842b-41a8-952f-4382abbdf250 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:44.089112055 +0000 UTC m=+18.109958262 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs") pod "network-metrics-daemon-4htjt" (UID: "aa479de0-842b-41a8-952f-4382abbdf250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:36.190238 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:36.190206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:36.190452 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:36.190417 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 10:01:36.190452 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:36.190439 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 10:01:36.190452 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:36.190453 2577 projected.go:194] Error preparing data for projected volume kube-api-access-56v2q for pod openshift-network-diagnostics/network-check-target-b258f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:36.190608 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:36.190514 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q podName:340db265-d04c-46d7-b5b0-6141dced7313 nodeName:}" failed. No retries permitted until 2026-04-20 10:01:44.190495095 +0000 UTC m=+18.211341301 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-56v2q" (UniqueName: "kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q") pod "network-check-target-b258f" (UID: "340db265-d04c-46d7-b5b0-6141dced7313") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:36.535156 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:36.535017 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:36.535156 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:36.535138 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:37.534013 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:37.533980 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:37.534427 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:37.534076 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:38.534543 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:38.534515 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:38.534940 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:38.534628 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:39.534547 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:39.534512 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:39.534993 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:39.534630 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:40.534185 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:40.534156 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:40.534396 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:40.534315 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:41.534355 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:41.534324 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:41.534738 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:41.534419 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:42.534073 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:42.534041 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:42.534255 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:42.534189 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:43.534447 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:43.534411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:43.534847 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:43.534526 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:44.153918 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:44.153886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:44.154072 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:44.154052 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:44.154146 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:44.154127 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs podName:aa479de0-842b-41a8-952f-4382abbdf250 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:00.15410709 +0000 UTC m=+34.174953297 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs") pod "network-metrics-daemon-4htjt" (UID: "aa479de0-842b-41a8-952f-4382abbdf250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:01:44.254684 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:44.254649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:44.254864 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:44.254833 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 10:01:44.254864 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:44.254855 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 10:01:44.254968 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:44.254868 2577 projected.go:194] Error preparing data for projected volume kube-api-access-56v2q for pod openshift-network-diagnostics/network-check-target-b258f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:44.254968 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:44.254933 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q podName:340db265-d04c-46d7-b5b0-6141dced7313 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:00.254912174 +0000 UTC m=+34.275758389 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-56v2q" (UniqueName: "kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q") pod "network-check-target-b258f" (UID: "340db265-d04c-46d7-b5b0-6141dced7313") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:01:44.534703 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:44.534674 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:44.535119 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:44.534808 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:45.534456 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:45.534419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:01:45.534635 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:45.534535 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313"
Apr 20 10:01:46.535390 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.535191 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:01:46.535882 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:46.535472 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250"
Apr 20 10:01:46.655482 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.655449 2577 generic.go:358] "Generic (PLEG): container finished" podID="f8dc4a94-baed-4ba8-8d13-6d45c52751d3" containerID="25d55779aec67863afd331be690bb5ff52f5b1fd9575eccc57a62caf374810b7" exitCode=0
Apr 20 10:01:46.655600 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.655530 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" event={"ID":"f8dc4a94-baed-4ba8-8d13-6d45c52751d3","Type":"ContainerDied","Data":"25d55779aec67863afd331be690bb5ff52f5b1fd9575eccc57a62caf374810b7"}
Apr 20 10:01:46.656906 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.656865 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4467h" event={"ID":"34621a55-49e0-4ddf-85fb-fe957bb51987","Type":"ContainerStarted","Data":"7070420783cd4d784093180349298bd22fed621ce9c4ccbf5d3c17281abd4307"}
Apr 20 10:01:46.658363 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.658338 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-78fm5" event={"ID":"a9613cd5-debf-4732-aa68-673d25ca0a6a","Type":"ContainerStarted","Data":"2c41cb2c198e4e455053ee37be122a6892580c01530f99629766f476a9db86d2"}
Apr 20 10:01:46.659724 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.659703 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log"
Apr 20 10:01:46.660097 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.660073 2577 generic.go:358] "Generic (PLEG): container finished" podID="55296336-b343-4c13-ad2f-c3ceff32fcfe" containerID="3005f4a30a5f8e22f6228f241db19754f6b809393590276ce40bc274f7e11fa9" exitCode=1
Apr 20 10:01:46.660187 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.660143 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerDied","Data":"3005f4a30a5f8e22f6228f241db19754f6b809393590276ce40bc274f7e11fa9"}
Apr 20 10:01:46.660187 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.660172 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"d28bd315b1ce73c1712c27e7493bd86131dc601c2763cfa5a78ae939aeff9435"}
Apr 20 10:01:46.661518 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.661493 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vq9gr" event={"ID":"93bf46da-c530-40ee-bced-9d0772cc84b7","Type":"ContainerStarted","Data":"bd87979e1d57f0e9fa2d23ead33f6e77f498fec53b05f9e18cc6b10da35bcc22"}
Apr 20 10:01:46.664600 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.664577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qlvb7" event={"ID":"081211a7-2d42-49fc-b457-6cded43e3390","Type":"ContainerStarted","Data":"c387fd7b58f3b84dc0792644bc44a62903d04f98c7b6357508dcadd868e2da20"}
Apr 20 10:01:46.665748 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.665716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" event={"ID":"1f12cab1-38db-4199-ab17-ed83ce13c27d","Type":"ContainerStarted","Data":"18579667d1adebd198c4a4586d1d1202793b2cfeef35f785368fb21f4d30fb32"}
Apr 20 10:01:46.668239 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.667264 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ldc77" event={"ID":"f4ce2de7-ed89-497b-b795-9aa124b05d1c","Type":"ContainerStarted","Data":"3305cb01fa1a58b4053067cfa1b19f99e5ec4a9b2b51b239f5ae74a6b19ba9b4"}
Apr 20 10:01:46.689930 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.689621 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-148.ec2.internal" podStartSLOduration=19.689606834 podStartE2EDuration="19.689606834s" podCreationTimestamp="2026-04-20 10:01:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:01:31.648955699 +0000 UTC m=+5.669801927" watchObservedRunningTime="2026-04-20 10:01:46.689606834 +0000 UTC m=+20.710453058"
Apr 20 10:01:46.705357 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.705298 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qlvb7" podStartSLOduration=3.6956229350000003 podStartE2EDuration="20.705283236s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.152880824 +0000 UTC m=+3.173727037" lastFinishedPulling="2026-04-20 10:01:46.162541123 +0000 UTC m=+20.183387338" observedRunningTime="2026-04-20 10:01:46.705282669 +0000 UTC m=+20.726128895" watchObservedRunningTime="2026-04-20 10:01:46.705283236 +0000 UTC m=+20.726129461"
Apr 20 10:01:46.724796 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.724762 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vq9gr" podStartSLOduration=3.643093596 podStartE2EDuration="20.724751597s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.143576534 +0000 UTC m=+3.164422737" lastFinishedPulling="2026-04-20 10:01:46.225234536 +0000 UTC m=+20.246080738" observedRunningTime="2026-04-20 10:01:46.724649056 +0000 UTC m=+20.745495279" watchObservedRunningTime="2026-04-20 10:01:46.724751597 +0000 UTC m=+20.745597821"
Apr 20 10:01:46.742484 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.742449 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ldc77" podStartSLOduration=3.6948665590000003 podStartE2EDuration="20.74243955s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.142196867 +0000 UTC m=+3.163043069" lastFinishedPulling="2026-04-20 10:01:46.189769842 +0000 UTC m=+20.210616060" observedRunningTime="2026-04-20 10:01:46.742091302 +0000 UTC m=+20.762937526" watchObservedRunningTime="2026-04-20 10:01:46.74243955 +0000 UTC m=+20.763285774"
Apr 20 10:01:46.758190 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.758154 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-78fm5" podStartSLOduration=3.749433389 podStartE2EDuration="20.758142565s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.153690465 +0000 UTC m=+3.174536679" lastFinishedPulling="2026-04-20 10:01:46.162399652 +0000 UTC m=+20.183245855" observedRunningTime="2026-04-20 10:01:46.758115934 +0000 UTC m=+20.778962159" watchObservedRunningTime="2026-04-20
10:01:46.758142565 +0000 UTC m=+20.778988782" Apr 20 10:01:46.775637 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:46.775608 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4467h" podStartSLOduration=4.068124725 podStartE2EDuration="20.775598731s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.154656117 +0000 UTC m=+3.175502324" lastFinishedPulling="2026-04-20 10:01:45.862130124 +0000 UTC m=+19.882976330" observedRunningTime="2026-04-20 10:01:46.775533374 +0000 UTC m=+20.796379601" watchObservedRunningTime="2026-04-20 10:01:46.775598731 +0000 UTC m=+20.796444976" Apr 20 10:01:47.534637 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.534478 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:01:47.534748 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:47.534709 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313" Apr 20 10:01:47.670848 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.670810 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-lcfcx" event={"ID":"13714523-2ce0-41a5-92d0-6d74b6f94cba","Type":"ContainerStarted","Data":"1d3a21d3ce9169375d41db754606d72931277b1157a922388903d0465b1ff45e"} Apr 20 10:01:47.673698 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.673679 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:01:47.674125 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.674001 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"fb1ed41515b902b1d04a453e685fd9076ad9675d8e80c9d6c072d4fb2ed70617"} Apr 20 10:01:47.674125 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.674036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"9cca9c8ef834fad6bed6c189a27d2dac8fbefccf424d1bed7195ebffb92bf1e1"} Apr 20 10:01:47.674125 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.674050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"d31572c09f474223ae0f03a7620aa97177aa442d971837462d6e08ca5a4b2ece"} Apr 20 10:01:47.674125 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.674062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" 
event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"3309a6b6db34dc5a5f95c50c08c47ab6919dc4ca788246752a57939bad78d51d"} Apr 20 10:01:47.686876 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.686841 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-lcfcx" podStartSLOduration=4.64672092 podStartE2EDuration="21.686828843s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.14966049 +0000 UTC m=+3.170506708" lastFinishedPulling="2026-04-20 10:01:46.189768415 +0000 UTC m=+20.210614631" observedRunningTime="2026-04-20 10:01:47.686749879 +0000 UTC m=+21.707596103" watchObservedRunningTime="2026-04-20 10:01:47.686828843 +0000 UTC m=+21.707675078" Apr 20 10:01:47.941819 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:47.941781 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 10:01:48.483425 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:48.483299 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T10:01:47.941801894Z","UUID":"a966bf7f-826e-4444-a269-871d9cef712a","Handler":null,"Name":"","Endpoint":""} Apr 20 10:01:48.486613 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:48.486229 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 10:01:48.486613 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:48.486249 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 10:01:48.534225 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:48.534195 2577 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:48.534383 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:48.534353 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250" Apr 20 10:01:48.678410 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:48.678360 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" event={"ID":"1f12cab1-38db-4199-ab17-ed83ce13c27d","Type":"ContainerStarted","Data":"2e3f571580ecd6a78f1f339acab9b2a476051a34cd1041d5e759fc3e83aaae5b"} Apr 20 10:01:49.534238 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:49.534204 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:01:49.534378 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:49.534355 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313" Apr 20 10:01:49.682495 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:49.682428 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" event={"ID":"1f12cab1-38db-4199-ab17-ed83ce13c27d","Type":"ContainerStarted","Data":"d1e9b47cb0cd005644b3a5d625f9b395728c27d4152b66fc33d93b98e16782ff"} Apr 20 10:01:49.700482 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:49.700437 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9ttl2" podStartSLOduration=3.6129798969999998 podStartE2EDuration="23.700420491s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.145357141 +0000 UTC m=+3.166203357" lastFinishedPulling="2026-04-20 10:01:49.23279775 +0000 UTC m=+23.253643951" observedRunningTime="2026-04-20 10:01:49.69981437 +0000 UTC m=+23.720660595" watchObservedRunningTime="2026-04-20 10:01:49.700420491 +0000 UTC m=+23.721266760" Apr 20 10:01:49.805999 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:49.805973 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-78fm5" Apr 20 10:01:50.534042 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:50.534012 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:50.534203 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:50.534153 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250" Apr 20 10:01:51.369438 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:51.369210 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-78fm5" Apr 20 10:01:51.369956 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:51.369939 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-78fm5" Apr 20 10:01:51.534639 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:51.534617 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:01:51.534734 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:51.534696 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313" Apr 20 10:01:51.687056 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:51.687000 2577 generic.go:358] "Generic (PLEG): container finished" podID="f8dc4a94-baed-4ba8-8d13-6d45c52751d3" containerID="b05cf477eea51aefdb402fee26d4282d4ac473129a7ad707a83bb8158e4158d9" exitCode=0 Apr 20 10:01:51.687160 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:51.687072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" event={"ID":"f8dc4a94-baed-4ba8-8d13-6d45c52751d3","Type":"ContainerDied","Data":"b05cf477eea51aefdb402fee26d4282d4ac473129a7ad707a83bb8158e4158d9"} Apr 20 10:01:51.690193 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:51.690173 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:01:51.690607 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:51.690580 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"c89f0f58056bc09dae1a66af0ae6af7cf763d53f2940f1bd76b37a77191b2198"} Apr 20 10:01:51.691189 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:51.691169 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-78fm5" Apr 20 10:01:52.534411 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:52.534391 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:52.534673 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:52.534484 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250" Apr 20 10:01:52.694430 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:52.694372 2577 generic.go:358] "Generic (PLEG): container finished" podID="f8dc4a94-baed-4ba8-8d13-6d45c52751d3" containerID="803b8fbccb18506cebbce73851c03c599aba60e5326cd6c2e062e7621253b3ce" exitCode=0 Apr 20 10:01:52.694522 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:52.694462 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" event={"ID":"f8dc4a94-baed-4ba8-8d13-6d45c52751d3","Type":"ContainerDied","Data":"803b8fbccb18506cebbce73851c03c599aba60e5326cd6c2e062e7621253b3ce"} Apr 20 10:01:53.534598 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.534565 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:01:53.534935 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:53.534677 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313" Apr 20 10:01:53.699411 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.699353 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:01:53.699827 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.699788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"013b5a0d89bfd05ec76eb13ae763eaf16ea72e2ef65b5da28edb75fa5ea24b5a"} Apr 20 10:01:53.700276 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.700247 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:53.700414 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.700289 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:53.700414 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.700292 2577 scope.go:117] "RemoveContainer" containerID="3005f4a30a5f8e22f6228f241db19754f6b809393590276ce40bc274f7e11fa9" Apr 20 10:01:53.700414 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.700322 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:53.702933 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.702902 2577 generic.go:358] "Generic (PLEG): container finished" podID="f8dc4a94-baed-4ba8-8d13-6d45c52751d3" containerID="8d85dc69673a23219df1248037a449edd68c6e170a17dd8dae91f55a1948abdf" exitCode=0 Apr 20 10:01:53.703095 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.702945 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" 
event={"ID":"f8dc4a94-baed-4ba8-8d13-6d45c52751d3","Type":"ContainerDied","Data":"8d85dc69673a23219df1248037a449edd68c6e170a17dd8dae91f55a1948abdf"} Apr 20 10:01:53.716235 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.716209 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:53.717756 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:53.717732 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:01:54.534637 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:54.534419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:54.535079 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:54.534751 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250" Apr 20 10:01:54.709945 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:54.709868 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:01:54.711081 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:54.710280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" event={"ID":"55296336-b343-4c13-ad2f-c3ceff32fcfe","Type":"ContainerStarted","Data":"3b5e953ab17decc4ce1f1da34c7fa77402325739ac0e22cae8bd72e9772dc25f"} Apr 20 10:01:54.914882 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:54.914698 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" podStartSLOduration=11.796123565 podStartE2EDuration="28.914676707s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.152638036 +0000 UTC m=+3.173484237" lastFinishedPulling="2026-04-20 10:01:46.271191164 +0000 UTC m=+20.292037379" observedRunningTime="2026-04-20 10:01:54.762275261 +0000 UTC m=+28.783121478" watchObservedRunningTime="2026-04-20 10:01:54.914676707 +0000 UTC m=+28.935522932" Apr 20 10:01:54.917681 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:54.917612 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b258f"] Apr 20 10:01:54.917812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:54.917757 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:01:54.917885 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:54.917860 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313" Apr 20 10:01:54.919288 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:54.919264 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4htjt"] Apr 20 10:01:54.919544 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:54.919399 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:54.919544 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:54.919524 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250" Apr 20 10:01:56.535126 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:56.535097 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:01:56.535742 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:56.535184 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:56.535742 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:56.535259 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313" Apr 20 10:01:56.535742 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:56.535273 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250" Apr 20 10:01:58.533780 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:58.533744 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:01:58.534251 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:58.533870 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b258f" podUID="340db265-d04c-46d7-b5b0-6141dced7313" Apr 20 10:01:58.534251 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:58.533947 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:01:58.534251 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:58.534035 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4htjt" podUID="aa479de0-842b-41a8-952f-4382abbdf250" Apr 20 10:01:59.374626 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.374600 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-148.ec2.internal" event="NodeReady" Apr 20 10:01:59.374765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.374724 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 10:01:59.427383 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.427223 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-gngqn"] Apr 20 10:01:59.430114 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.430095 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-gngqn" Apr 20 10:01:59.430434 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.430294 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-f6m7h"] Apr 20 10:01:59.432818 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.432795 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 10:01:59.433110 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.433084 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qvhbt\"" Apr 20 10:01:59.433239 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.433198 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f6m7h" Apr 20 10:01:59.433360 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.433215 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 10:01:59.435151 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.435133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 10:01:59.435253 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.435134 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 10:01:59.435253 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.435159 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 10:01:59.435253 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.435180 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4xmrs\"" Apr 20 10:01:59.439716 ip-10-0-138-148 kubenswrapper[2577]: 
I0420 10:01:59.439694 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gngqn"] Apr 20 10:01:59.444916 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.444882 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f6m7h"] Apr 20 10:01:59.577025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.576976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h" Apr 20 10:01:59.577282 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.577039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58441859-cdfd-435c-a17f-f77225ae4513-config-volume\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn" Apr 20 10:01:59.577282 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.577079 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78mvk\" (UniqueName: \"kubernetes.io/projected/033bb25f-40eb-4a7f-ad48-e552fca86c6d-kube-api-access-78mvk\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h" Apr 20 10:01:59.577282 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.577132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn" Apr 20 10:01:59.577282 ip-10-0-138-148 
kubenswrapper[2577]: I0420 10:01:59.577160 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddck\" (UniqueName: \"kubernetes.io/projected/58441859-cdfd-435c-a17f-f77225ae4513-kube-api-access-kddck\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.577282 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.577203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/58441859-cdfd-435c-a17f-f77225ae4513-tmp-dir\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.678249 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.678221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:01:59.678344 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.678283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58441859-cdfd-435c-a17f-f77225ae4513-config-volume\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.678344 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.678325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78mvk\" (UniqueName: \"kubernetes.io/projected/033bb25f-40eb-4a7f-ad48-e552fca86c6d-kube-api-access-78mvk\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:01:59.678424 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.678349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.678424 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:59.678403 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 10:01:59.678510 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:59.678480 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 10:01:59.678510 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:59.678498 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert podName:033bb25f-40eb-4a7f-ad48-e552fca86c6d nodeName:}" failed. No retries permitted until 2026-04-20 10:02:00.17847585 +0000 UTC m=+34.199322068 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert") pod "ingress-canary-f6m7h" (UID: "033bb25f-40eb-4a7f-ad48-e552fca86c6d") : secret "canary-serving-cert" not found
Apr 20 10:01:59.678610 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:01:59.678515 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls podName:58441859-cdfd-435c-a17f-f77225ae4513 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:00.178507103 +0000 UTC m=+34.199353318 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls") pod "dns-default-gngqn" (UID: "58441859-cdfd-435c-a17f-f77225ae4513") : secret "dns-default-metrics-tls" not found
Apr 20 10:01:59.678610 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.678495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kddck\" (UniqueName: \"kubernetes.io/projected/58441859-cdfd-435c-a17f-f77225ae4513-kube-api-access-kddck\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.678610 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.678547 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/58441859-cdfd-435c-a17f-f77225ae4513-tmp-dir\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.678846 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.678826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/58441859-cdfd-435c-a17f-f77225ae4513-tmp-dir\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.678886 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.678861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58441859-cdfd-435c-a17f-f77225ae4513-config-volume\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.688531 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.688507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddck\" (UniqueName: \"kubernetes.io/projected/58441859-cdfd-435c-a17f-f77225ae4513-kube-api-access-kddck\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:01:59.688642 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.688561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78mvk\" (UniqueName: \"kubernetes.io/projected/033bb25f-40eb-4a7f-ad48-e552fca86c6d-kube-api-access-78mvk\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:01:59.721077 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.721054 2577 generic.go:358] "Generic (PLEG): container finished" podID="f8dc4a94-baed-4ba8-8d13-6d45c52751d3" containerID="2085b13906a86483a6669be4b0cad27e7cc14efabd7d9114cf88cac1549f71a1" exitCode=0
Apr 20 10:01:59.721163 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:01:59.721092 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" event={"ID":"f8dc4a94-baed-4ba8-8d13-6d45c52751d3","Type":"ContainerDied","Data":"2085b13906a86483a6669be4b0cad27e7cc14efabd7d9114cf88cac1549f71a1"}
Apr 20 10:02:00.181487 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.181422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:02:00.181487 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.181469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:02:00.181639 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.181501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:02:00.181639 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.181556 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:02:00.181639 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.181579 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 10:02:00.181639 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.181616 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 10:02:00.181639 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.181631 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs podName:aa479de0-842b-41a8-952f-4382abbdf250 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:32.181609349 +0000 UTC m=+66.202455559 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs") pod "network-metrics-daemon-4htjt" (UID: "aa479de0-842b-41a8-952f-4382abbdf250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 10:02:00.181805 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.181653 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert podName:033bb25f-40eb-4a7f-ad48-e552fca86c6d nodeName:}" failed. No retries permitted until 2026-04-20 10:02:01.181641621 +0000 UTC m=+35.202487827 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert") pod "ingress-canary-f6m7h" (UID: "033bb25f-40eb-4a7f-ad48-e552fca86c6d") : secret "canary-serving-cert" not found
Apr 20 10:02:00.181805 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.181727 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls podName:58441859-cdfd-435c-a17f-f77225ae4513 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:01.181715991 +0000 UTC m=+35.202562206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls") pod "dns-default-gngqn" (UID: "58441859-cdfd-435c-a17f-f77225ae4513") : secret "dns-default-metrics-tls" not found
Apr 20 10:02:00.282398 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.282368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:02:00.282518 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.282498 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 10:02:00.282518 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.282514 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 10:02:00.282600 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.282524 2577 projected.go:194] Error preparing data for projected volume kube-api-access-56v2q for pod openshift-network-diagnostics/network-check-target-b258f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:02:00.282600 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:00.282567 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q podName:340db265-d04c-46d7-b5b0-6141dced7313 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:32.282554362 +0000 UTC m=+66.303400564 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-56v2q" (UniqueName: "kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q") pod "network-check-target-b258f" (UID: "340db265-d04c-46d7-b5b0-6141dced7313") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 10:02:00.534718 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.534691 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:02:00.534834 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.534722 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt"
Apr 20 10:02:00.537142 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.537122 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 10:02:00.537551 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.537528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjrcf\""
Apr 20 10:02:00.537551 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.537543 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 10:02:00.537732 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.537545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-plq86\""
Apr 20 10:02:00.537732 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.537545 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 10:02:00.725929 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.725902 2577 generic.go:358] "Generic (PLEG): container finished" podID="f8dc4a94-baed-4ba8-8d13-6d45c52751d3" containerID="c6598989167fbee4df02fb631a65282a8b4d97835d39901c789dc2a09849da02" exitCode=0
Apr 20 10:02:00.726241 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.725962 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" event={"ID":"f8dc4a94-baed-4ba8-8d13-6d45c52751d3","Type":"ContainerDied","Data":"c6598989167fbee4df02fb631a65282a8b4d97835d39901c789dc2a09849da02"}
Apr 20 10:02:00.802633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.802611 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5xc8h"]
Apr 20 10:02:00.806693 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.806677 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.808785 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.808759 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 20 10:02:00.808785 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.808760 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 20 10:02:00.809004 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.808851 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 20 10:02:00.809150 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.809136 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 20 10:02:00.809220 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.809202 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-sg9v6\""
Apr 20 10:02:00.816244 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.816218 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5xc8h"]
Apr 20 10:02:00.886815 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.886794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-signing-key\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.886928 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.886819 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-signing-cabundle\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.886928 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.886839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-794pj\" (UniqueName: \"kubernetes.io/projected/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-kube-api-access-794pj\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.987088 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.987064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-signing-key\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.987168 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.987092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-signing-cabundle\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.987168 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.987107 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-794pj\" (UniqueName: \"kubernetes.io/projected/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-kube-api-access-794pj\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.987826 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.987807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-signing-cabundle\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.989602 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.989586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-signing-key\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:00.999765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:00.999746 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-794pj\" (UniqueName: \"kubernetes.io/projected/9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b-kube-api-access-794pj\") pod \"service-ca-865cb79987-5xc8h\" (UID: \"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b\") " pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:01.123746 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:01.123694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5xc8h"
Apr 20 10:02:01.188477 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:01.188449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:02:01.188594 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:01.188520 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:02:01.188651 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:01.188608 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 10:02:01.188714 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:01.188663 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls podName:58441859-cdfd-435c-a17f-f77225ae4513 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:03.188650311 +0000 UTC m=+37.209496513 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls") pod "dns-default-gngqn" (UID: "58441859-cdfd-435c-a17f-f77225ae4513") : secret "dns-default-metrics-tls" not found
Apr 20 10:02:01.188714 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:01.188608 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 10:02:01.188830 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:01.188747 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert podName:033bb25f-40eb-4a7f-ad48-e552fca86c6d nodeName:}" failed. No retries permitted until 2026-04-20 10:02:03.188734007 +0000 UTC m=+37.209580223 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert") pod "ingress-canary-f6m7h" (UID: "033bb25f-40eb-4a7f-ad48-e552fca86c6d") : secret "canary-serving-cert" not found
Apr 20 10:02:01.252182 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:01.252153 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5xc8h"]
Apr 20 10:02:01.256700 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:01.256674 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e0bf27c_dbd8_4449_b3f8_9a2e7c0ac56b.slice/crio-e54be91077ac6baea101ca862e7826ab938a45fed65d222ff7643e81b5be146c WatchSource:0}: Error finding container e54be91077ac6baea101ca862e7826ab938a45fed65d222ff7643e81b5be146c: Status 404 returned error can't find the container with id e54be91077ac6baea101ca862e7826ab938a45fed65d222ff7643e81b5be146c
Apr 20 10:02:01.489400 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:01.489297 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qlvb7_081211a7-2d42-49fc-b457-6cded43e3390/dns-node-resolver/0.log"
Apr 20 10:02:01.729267 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:01.729237 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5xc8h" event={"ID":"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b","Type":"ContainerStarted","Data":"e54be91077ac6baea101ca862e7826ab938a45fed65d222ff7643e81b5be146c"}
Apr 20 10:02:01.732459 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:01.732436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" event={"ID":"f8dc4a94-baed-4ba8-8d13-6d45c52751d3","Type":"ContainerStarted","Data":"10779c32c0378179d745b10b9129c61bd6a22e28bde931cb3e4b815f67555665"}
Apr 20 10:02:01.761467 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:01.761427 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j7mgk" podStartSLOduration=5.60447853 podStartE2EDuration="35.761416099s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:01:29.151697924 +0000 UTC m=+3.172544142" lastFinishedPulling="2026-04-20 10:01:59.308635508 +0000 UTC m=+33.329481711" observedRunningTime="2026-04-20 10:02:01.760794699 +0000 UTC m=+35.781640923" watchObservedRunningTime="2026-04-20 10:02:01.761416099 +0000 UTC m=+35.782262353"
Apr 20 10:02:02.256344 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:02.256151 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4467h_34621a55-49e0-4ddf-85fb-fe957bb51987/node-ca/0.log"
Apr 20 10:02:03.202488 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:03.202454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:02:03.202881 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:03.202614 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 10:02:03.202881 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:03.202666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:02:03.202881 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:03.202675 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert podName:033bb25f-40eb-4a7f-ad48-e552fca86c6d nodeName:}" failed. No retries permitted until 2026-04-20 10:02:07.20266131 +0000 UTC m=+41.223507511 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert") pod "ingress-canary-f6m7h" (UID: "033bb25f-40eb-4a7f-ad48-e552fca86c6d") : secret "canary-serving-cert" not found
Apr 20 10:02:03.202881 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:03.202788 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 10:02:03.202881 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:03.202832 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls podName:58441859-cdfd-435c-a17f-f77225ae4513 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:07.20282037 +0000 UTC m=+41.223666573 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls") pod "dns-default-gngqn" (UID: "58441859-cdfd-435c-a17f-f77225ae4513") : secret "dns-default-metrics-tls" not found
Apr 20 10:02:03.736902 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:03.736835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5xc8h" event={"ID":"9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b","Type":"ContainerStarted","Data":"5fdfa654e6582a22273396b5bf018bb7e9e6b01bb459cadbdc054999bd04532b"}
Apr 20 10:02:03.754612 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:03.754571 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-5xc8h" podStartSLOduration=1.6044267570000001 podStartE2EDuration="3.754558306s" podCreationTimestamp="2026-04-20 10:02:00 +0000 UTC" firstStartedPulling="2026-04-20 10:02:01.258552959 +0000 UTC m=+35.279399165" lastFinishedPulling="2026-04-20 10:02:03.408684507 +0000 UTC m=+37.429530714" observedRunningTime="2026-04-20 10:02:03.753610178 +0000 UTC m=+37.774456401" watchObservedRunningTime="2026-04-20 10:02:03.754558306 +0000 UTC m=+37.775404531"
Apr 20 10:02:07.230393 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:07.230358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:02:07.231006 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:07.230421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:02:07.231006 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:07.230482 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 10:02:07.231006 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:07.230535 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls podName:58441859-cdfd-435c-a17f-f77225ae4513 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:15.230520781 +0000 UTC m=+49.251366984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls") pod "dns-default-gngqn" (UID: "58441859-cdfd-435c-a17f-f77225ae4513") : secret "dns-default-metrics-tls" not found
Apr 20 10:02:07.231006 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:07.230536 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 10:02:07.231006 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:07.230579 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert podName:033bb25f-40eb-4a7f-ad48-e552fca86c6d nodeName:}" failed. No retries permitted until 2026-04-20 10:02:15.230564073 +0000 UTC m=+49.251410283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert") pod "ingress-canary-f6m7h" (UID: "033bb25f-40eb-4a7f-ad48-e552fca86c6d") : secret "canary-serving-cert" not found
Apr 20 10:02:15.284799 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.284765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:02:15.285359 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.284836 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:02:15.288147 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.288125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/58441859-cdfd-435c-a17f-f77225ae4513-metrics-tls\") pod \"dns-default-gngqn\" (UID: \"58441859-cdfd-435c-a17f-f77225ae4513\") " pod="openshift-dns/dns-default-gngqn"
Apr 20 10:02:15.288241 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.288154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/033bb25f-40eb-4a7f-ad48-e552fca86c6d-cert\") pod \"ingress-canary-f6m7h\" (UID: \"033bb25f-40eb-4a7f-ad48-e552fca86c6d\") " pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:02:15.341806 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.341784 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-gngqn"
Apr 20 10:02:15.347365 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.347345 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f6m7h"
Apr 20 10:02:15.494633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.494600 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-gngqn"]
Apr 20 10:02:15.499479 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:15.499454 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58441859_cdfd_435c_a17f_f77225ae4513.slice/crio-3de6117b1237db53fd7963dddef5a79669f35b5ba1154d29d0743addb54e1e42 WatchSource:0}: Error finding container 3de6117b1237db53fd7963dddef5a79669f35b5ba1154d29d0743addb54e1e42: Status 404 returned error can't find the container with id 3de6117b1237db53fd7963dddef5a79669f35b5ba1154d29d0743addb54e1e42
Apr 20 10:02:15.515481 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.515462 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f6m7h"]
Apr 20 10:02:15.521160 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:15.521142 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod033bb25f_40eb_4a7f_ad48_e552fca86c6d.slice/crio-b561fd511143bb3380e6da1bf6fe209f3a15477f9b335cd9645f982836bd4877 WatchSource:0}: Error finding container b561fd511143bb3380e6da1bf6fe209f3a15477f9b335cd9645f982836bd4877: Status 404 returned error can't find the container with id b561fd511143bb3380e6da1bf6fe209f3a15477f9b335cd9645f982836bd4877
Apr 20 10:02:15.759126 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.759087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f6m7h" event={"ID":"033bb25f-40eb-4a7f-ad48-e552fca86c6d","Type":"ContainerStarted","Data":"b561fd511143bb3380e6da1bf6fe209f3a15477f9b335cd9645f982836bd4877"}
Apr 20 10:02:15.760116 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:15.760081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gngqn" event={"ID":"58441859-cdfd-435c-a17f-f77225ae4513","Type":"ContainerStarted","Data":"3de6117b1237db53fd7963dddef5a79669f35b5ba1154d29d0743addb54e1e42"}
Apr 20 10:02:18.767721 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:18.767683 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f6m7h" event={"ID":"033bb25f-40eb-4a7f-ad48-e552fca86c6d","Type":"ContainerStarted","Data":"82310c5af27af9a791bf9dbcd568299ab5a4a2cd5d6caa6438ef78b2c4d624e1"}
Apr 20 10:02:18.769219 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:18.769196 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gngqn" event={"ID":"58441859-cdfd-435c-a17f-f77225ae4513","Type":"ContainerStarted","Data":"e5e4211f471f6d38091b5421ca687d68ee2ae96a2451236e2ddedbd393c0cb13"}
Apr 20 10:02:18.769330 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:18.769225 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-gngqn" event={"ID":"58441859-cdfd-435c-a17f-f77225ae4513","Type":"ContainerStarted","Data":"2c2db2013cca22f2d9a9ef0094fb1c22bd30e8cc5bfb281ed8e720336b90ce87"}
Apr 20 10:02:18.769393 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:18.769374 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-gngqn"
Apr 20 10:02:18.783829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:18.783777 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-f6m7h" podStartSLOduration=17.275934801 podStartE2EDuration="19.783763852s" podCreationTimestamp="2026-04-20 10:01:59 +0000 UTC" firstStartedPulling="2026-04-20 10:02:15.522873957 +0000 UTC m=+49.543720159" lastFinishedPulling="2026-04-20 10:02:18.030703009 +0000 UTC m=+52.051549210" observedRunningTime="2026-04-20 10:02:18.783262126 +0000 UTC m=+52.804108356" watchObservedRunningTime="2026-04-20 10:02:18.783763852 +0000 UTC m=+52.804610076"
Apr 20 10:02:20.273245 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.273198 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-gngqn" podStartSLOduration=18.74807403 podStartE2EDuration="21.27318092s" podCreationTimestamp="2026-04-20 10:01:59 +0000 UTC" firstStartedPulling="2026-04-20 10:02:15.501181872 +0000 UTC m=+49.522028077" lastFinishedPulling="2026-04-20 10:02:18.026288761 +0000 UTC m=+52.047134967" observedRunningTime="2026-04-20 10:02:18.800209668 +0000 UTC m=+52.821055906" watchObservedRunningTime="2026-04-20 10:02:20.27318092 +0000 UTC m=+54.294027143"
Apr 20 10:02:20.273692 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.273651 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b68f8c6f9-rrmkb"]
Apr 20 10:02:20.302169 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.302140 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb"
Apr 20 10:02:20.307243 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.307216 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 10:02:20.307367 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.307335 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b68f8c6f9-rrmkb"]
Apr 20 10:02:20.309400 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.309363 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w6bmx\""
Apr 20 10:02:20.309515 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.309499 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 10:02:20.314427 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.314241 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 10:02:20.332741 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.332725 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 10:02:20.337199 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.337179 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-msdsc"]
Apr 20 10:02:20.357343 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.357323 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-msdsc"]
Apr 20 10:02:20.357417 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.357402 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.361609 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.361584 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 10:02:20.361705 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.361687 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 10:02:20.362459 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.362439 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 10:02:20.362552 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.362513 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 10:02:20.363400 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.363381 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wtn5t\"" Apr 20 10:02:20.419962 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.419943 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-registry-tls\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.420060 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.419967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c58a395e-b75a-40a0-b474-87562006f6e1-registry-certificates\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: 
\"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.420060 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.419995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c58a395e-b75a-40a0-b474-87562006f6e1-image-registry-private-configuration\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.420060 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.420011 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c58a395e-b75a-40a0-b474-87562006f6e1-installation-pull-secrets\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.420225 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.420095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh4sq\" (UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-kube-api-access-gh4sq\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.420225 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.420137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-bound-sa-token\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.420225 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.420168 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c58a395e-b75a-40a0-b474-87562006f6e1-ca-trust-extracted\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.420225 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.420205 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c58a395e-b75a-40a0-b474-87562006f6e1-trusted-ca\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.520546 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-registry-tls\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.520641 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c58a395e-b75a-40a0-b474-87562006f6e1-registry-certificates\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.520641 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520580 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/c58a395e-b75a-40a0-b474-87562006f6e1-image-registry-private-configuration\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.520641 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c58a395e-b75a-40a0-b474-87562006f6e1-installation-pull-secrets\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.520641 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5775bdcc-774b-426c-b355-50ab682f46eb-data-volume\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.520641 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh4sq\" (UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-kube-api-access-gh4sq\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.520641 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5775bdcc-774b-426c-b355-50ab682f46eb-crio-socket\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " 
pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.521002 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520693 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbp2h\" (UniqueName: \"kubernetes.io/projected/5775bdcc-774b-426c-b355-50ab682f46eb-kube-api-access-sbp2h\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.521002 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-bound-sa-token\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.521002 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c58a395e-b75a-40a0-b474-87562006f6e1-ca-trust-extracted\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.521002 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520775 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5775bdcc-774b-426c-b355-50ab682f46eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.521002 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c58a395e-b75a-40a0-b474-87562006f6e1-trusted-ca\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.521002 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.520850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5775bdcc-774b-426c-b355-50ab682f46eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.521295 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.521197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c58a395e-b75a-40a0-b474-87562006f6e1-ca-trust-extracted\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.521629 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.521607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c58a395e-b75a-40a0-b474-87562006f6e1-registry-certificates\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.521799 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.521778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c58a395e-b75a-40a0-b474-87562006f6e1-trusted-ca\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " 
pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.523036 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.523011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c58a395e-b75a-40a0-b474-87562006f6e1-image-registry-private-configuration\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.523135 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.523098 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-registry-tls\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.523276 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.523235 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c58a395e-b75a-40a0-b474-87562006f6e1-installation-pull-secrets\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.529235 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.529210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh4sq\" (UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-kube-api-access-gh4sq\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.529771 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.529755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/c58a395e-b75a-40a0-b474-87562006f6e1-bound-sa-token\") pod \"image-registry-6b68f8c6f9-rrmkb\" (UID: \"c58a395e-b75a-40a0-b474-87562006f6e1\") " pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.611446 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.611423 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:20.621161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.621132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5775bdcc-774b-426c-b355-50ab682f46eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.621248 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.621232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5775bdcc-774b-426c-b355-50ab682f46eb-data-volume\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.621298 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.621277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5775bdcc-774b-426c-b355-50ab682f46eb-crio-socket\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.621368 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.621324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbp2h\" (UniqueName: 
\"kubernetes.io/projected/5775bdcc-774b-426c-b355-50ab682f46eb-kube-api-access-sbp2h\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.621368 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.621357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5775bdcc-774b-426c-b355-50ab682f46eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.621514 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.621486 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5775bdcc-774b-426c-b355-50ab682f46eb-crio-socket\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.621639 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.621615 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5775bdcc-774b-426c-b355-50ab682f46eb-data-volume\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.621889 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.621870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5775bdcc-774b-426c-b355-50ab682f46eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.623845 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.623826 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5775bdcc-774b-426c-b355-50ab682f46eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.631656 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.631629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbp2h\" (UniqueName: \"kubernetes.io/projected/5775bdcc-774b-426c-b355-50ab682f46eb-kube-api-access-sbp2h\") pod \"insights-runtime-extractor-msdsc\" (UID: \"5775bdcc-774b-426c-b355-50ab682f46eb\") " pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.665818 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.665791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-msdsc" Apr 20 10:02:20.727402 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.727350 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b68f8c6f9-rrmkb"] Apr 20 10:02:20.733116 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:20.733077 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc58a395e_b75a_40a0_b474_87562006f6e1.slice/crio-4de085645cd3b4c1d85ff5909daa994d811c3bb04594df449c913730ae1cef22 WatchSource:0}: Error finding container 4de085645cd3b4c1d85ff5909daa994d811c3bb04594df449c913730ae1cef22: Status 404 returned error can't find the container with id 4de085645cd3b4c1d85ff5909daa994d811c3bb04594df449c913730ae1cef22 Apr 20 10:02:20.773729 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.773677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" 
event={"ID":"c58a395e-b75a-40a0-b474-87562006f6e1","Type":"ContainerStarted","Data":"4de085645cd3b4c1d85ff5909daa994d811c3bb04594df449c913730ae1cef22"} Apr 20 10:02:20.802322 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:20.802282 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-msdsc"] Apr 20 10:02:20.805127 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:20.805102 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5775bdcc_774b_426c_b355_50ab682f46eb.slice/crio-a4c60a880c7246287b0927f212e4d340ab4cefd2041459667018cdc237bc3281 WatchSource:0}: Error finding container a4c60a880c7246287b0927f212e4d340ab4cefd2041459667018cdc237bc3281: Status 404 returned error can't find the container with id a4c60a880c7246287b0927f212e4d340ab4cefd2041459667018cdc237bc3281 Apr 20 10:02:21.777678 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:21.777650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" event={"ID":"c58a395e-b75a-40a0-b474-87562006f6e1","Type":"ContainerStarted","Data":"63f7c6f0aeb452d32589cb38145cb242a4303f513cebd20094c637718d60d68c"} Apr 20 10:02:21.778015 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:21.777776 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" Apr 20 10:02:21.779340 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:21.779318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-msdsc" event={"ID":"5775bdcc-774b-426c-b355-50ab682f46eb","Type":"ContainerStarted","Data":"0b83e72226d21c361d3a593190e360b40710e144c380e81f4cee359a33d0ae97"} Apr 20 10:02:21.779417 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:21.779353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-msdsc" event={"ID":"5775bdcc-774b-426c-b355-50ab682f46eb","Type":"ContainerStarted","Data":"bc7e4b5818ca12da49b054967fecbf603905585e8fc9f0bda266e6fbe60dc492"} Apr 20 10:02:21.779417 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:21.779366 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-msdsc" event={"ID":"5775bdcc-774b-426c-b355-50ab682f46eb","Type":"ContainerStarted","Data":"a4c60a880c7246287b0927f212e4d340ab4cefd2041459667018cdc237bc3281"} Apr 20 10:02:23.785671 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:23.785640 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-msdsc" event={"ID":"5775bdcc-774b-426c-b355-50ab682f46eb","Type":"ContainerStarted","Data":"40421d16a78d5206628f3df5e05a9103e4b731735e9aede345c4a737042ac2c7"} Apr 20 10:02:23.804500 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:23.804461 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb" podStartSLOduration=3.804448668 podStartE2EDuration="3.804448668s" podCreationTimestamp="2026-04-20 10:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:02:21.799764762 +0000 UTC m=+55.820610985" watchObservedRunningTime="2026-04-20 10:02:23.804448668 +0000 UTC m=+57.825294891" Apr 20 10:02:23.804732 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:23.804713 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-msdsc" podStartSLOduration=1.35933852 podStartE2EDuration="3.804708344s" podCreationTimestamp="2026-04-20 10:02:20 +0000 UTC" firstStartedPulling="2026-04-20 10:02:20.855721681 +0000 UTC m=+54.876567892" lastFinishedPulling="2026-04-20 10:02:23.301091514 +0000 UTC m=+57.321937716" 
observedRunningTime="2026-04-20 10:02:23.803566793 +0000 UTC m=+57.824413017" watchObservedRunningTime="2026-04-20 10:02:23.804708344 +0000 UTC m=+57.825554568" Apr 20 10:02:25.721691 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:25.721666 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwdv7" Apr 20 10:02:26.621812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.621786 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c"] Apr 20 10:02:26.626197 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.626182 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" Apr 20 10:02:26.629888 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.629869 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-cdqnp\"" Apr 20 10:02:26.630158 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.630141 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 10:02:26.637350 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.637331 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c"] Apr 20 10:02:26.762978 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.762952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dab700b5-5990-4d1f-882a-c2eef84a1305-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zdb9c\" (UID: \"dab700b5-5990-4d1f-882a-c2eef84a1305\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" Apr 20 10:02:26.863903 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.863880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dab700b5-5990-4d1f-882a-c2eef84a1305-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zdb9c\" (UID: \"dab700b5-5990-4d1f-882a-c2eef84a1305\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" Apr 20 10:02:26.866627 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.866610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dab700b5-5990-4d1f-882a-c2eef84a1305-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-zdb9c\" (UID: \"dab700b5-5990-4d1f-882a-c2eef84a1305\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" Apr 20 10:02:26.934327 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:26.934260 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" Apr 20 10:02:27.045198 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:27.045167 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c"] Apr 20 10:02:27.048636 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:27.048607 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddab700b5_5990_4d1f_882a_c2eef84a1305.slice/crio-4712fe282ed6f15f0c8e6b36c0c146e72a9fc722cb1db56474b222350de1769e WatchSource:0}: Error finding container 4712fe282ed6f15f0c8e6b36c0c146e72a9fc722cb1db56474b222350de1769e: Status 404 returned error can't find the container with id 4712fe282ed6f15f0c8e6b36c0c146e72a9fc722cb1db56474b222350de1769e Apr 20 10:02:27.796877 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:27.796839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" event={"ID":"dab700b5-5990-4d1f-882a-c2eef84a1305","Type":"ContainerStarted","Data":"4712fe282ed6f15f0c8e6b36c0c146e72a9fc722cb1db56474b222350de1769e"} Apr 20 10:02:28.772956 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:28.772927 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-gngqn" Apr 20 10:02:28.802996 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:28.802960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" event={"ID":"dab700b5-5990-4d1f-882a-c2eef84a1305","Type":"ContainerStarted","Data":"0fcf6904456761ac7869c1eef06d9677f07322c208ce14a755ea971ee67ab732"} Apr 20 10:02:28.803452 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:28.803432 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" Apr 20 10:02:28.808474 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:28.808451 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" Apr 20 10:02:28.818324 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:28.818258 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-zdb9c" podStartSLOduration=1.758101988 podStartE2EDuration="2.818242111s" podCreationTimestamp="2026-04-20 10:02:26 +0000 UTC" firstStartedPulling="2026-04-20 10:02:27.050643979 +0000 UTC m=+61.071490184" lastFinishedPulling="2026-04-20 10:02:28.110784105 +0000 UTC m=+62.131630307" observedRunningTime="2026-04-20 10:02:28.818091108 +0000 UTC m=+62.838937367" watchObservedRunningTime="2026-04-20 10:02:28.818242111 +0000 UTC m=+62.839088335" Apr 20 10:02:29.671443 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.671410 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vt54g"] Apr 20 10:02:29.674240 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.674224 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.677291 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.677269 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qt5vq\"" Apr 20 10:02:29.677406 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.677350 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 10:02:29.677406 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.677367 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 10:02:29.677511 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.677419 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 10:02:29.677511 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.677370 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 10:02:29.677664 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.677639 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 10:02:29.683964 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.683944 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vt54g"] Apr 20 10:02:29.785503 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.785478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: 
\"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.785592 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.785509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sknfx\" (UniqueName: \"kubernetes.io/projected/45c27faf-167f-454a-9c28-7d7fa2f034a7-kube-api-access-sknfx\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.785592 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.785541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45c27faf-167f-454a-9c28-7d7fa2f034a7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.785675 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.785636 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.886114 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.886093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sknfx\" (UniqueName: \"kubernetes.io/projected/45c27faf-167f-454a-9c28-7d7fa2f034a7-kube-api-access-sknfx\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 
10:02:29.886455 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.886137 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45c27faf-167f-454a-9c28-7d7fa2f034a7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.886455 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.886180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.886455 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.886205 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.886455 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:29.886286 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 10:02:29.886455 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:29.886360 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-tls podName:45c27faf-167f-454a-9c28-7d7fa2f034a7 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:30.386343187 +0000 UTC m=+64.407189406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-vt54g" (UID: "45c27faf-167f-454a-9c28-7d7fa2f034a7") : secret "prometheus-operator-tls" not found Apr 20 10:02:29.886846 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.886828 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45c27faf-167f-454a-9c28-7d7fa2f034a7-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.888559 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.888538 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:29.896764 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:29.896742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sknfx\" (UniqueName: \"kubernetes.io/projected/45c27faf-167f-454a-9c28-7d7fa2f034a7-kube-api-access-sknfx\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:30.389620 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:30.389585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: 
\"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:30.391790 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:30.391768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/45c27faf-167f-454a-9c28-7d7fa2f034a7-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-vt54g\" (UID: \"45c27faf-167f-454a-9c28-7d7fa2f034a7\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:30.583022 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:30.582994 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" Apr 20 10:02:30.707042 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:30.707015 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-vt54g"] Apr 20 10:02:30.709698 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:30.709673 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c27faf_167f_454a_9c28_7d7fa2f034a7.slice/crio-b0b518b2a2c3273256f06bccf27b3ff35d26550ecca906e823c12b7a6f5d36fe WatchSource:0}: Error finding container b0b518b2a2c3273256f06bccf27b3ff35d26550ecca906e823c12b7a6f5d36fe: Status 404 returned error can't find the container with id b0b518b2a2c3273256f06bccf27b3ff35d26550ecca906e823c12b7a6f5d36fe Apr 20 10:02:30.809112 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:30.809086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" event={"ID":"45c27faf-167f-454a-9c28-7d7fa2f034a7","Type":"ContainerStarted","Data":"b0b518b2a2c3273256f06bccf27b3ff35d26550ecca906e823c12b7a6f5d36fe"} Apr 20 10:02:32.201656 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.201620 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:02:32.204114 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.204096 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 10:02:32.214629 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.214607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aa479de0-842b-41a8-952f-4382abbdf250-metrics-certs\") pod \"network-metrics-daemon-4htjt\" (UID: \"aa479de0-842b-41a8-952f-4382abbdf250\") " pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:02:32.302025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.302000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56v2q\" (UniqueName: \"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:02:32.304341 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.304277 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 10:02:32.314777 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.314762 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 10:02:32.325500 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.325482 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56v2q\" (UniqueName: 
\"kubernetes.io/projected/340db265-d04c-46d7-b5b0-6141dced7313-kube-api-access-56v2q\") pod \"network-check-target-b258f\" (UID: \"340db265-d04c-46d7-b5b0-6141dced7313\") " pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:02:32.347542 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.347521 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-plq86\"" Apr 20 10:02:32.352139 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.352125 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sjrcf\"" Apr 20 10:02:32.355624 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.355610 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:02:32.361170 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.361156 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4htjt" Apr 20 10:02:32.657391 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.657342 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4htjt"] Apr 20 10:02:32.660332 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:32.660287 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa479de0_842b_41a8_952f_4382abbdf250.slice/crio-3cef2496dd78abf02fb39319d197a48f9e671687f8fc62964bdc082ee56a2233 WatchSource:0}: Error finding container 3cef2496dd78abf02fb39319d197a48f9e671687f8fc62964bdc082ee56a2233: Status 404 returned error can't find the container with id 3cef2496dd78abf02fb39319d197a48f9e671687f8fc62964bdc082ee56a2233 Apr 20 10:02:32.676670 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.676643 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b258f"] Apr 20 10:02:32.680328 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:32.680280 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod340db265_d04c_46d7_b5b0_6141dced7313.slice/crio-12cceced10b16a85270eae91aeebc7862948743572be75b01af3f39e038b4258 WatchSource:0}: Error finding container 12cceced10b16a85270eae91aeebc7862948743572be75b01af3f39e038b4258: Status 404 returned error can't find the container with id 12cceced10b16a85270eae91aeebc7862948743572be75b01af3f39e038b4258 Apr 20 10:02:32.816091 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.816024 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b258f" event={"ID":"340db265-d04c-46d7-b5b0-6141dced7313","Type":"ContainerStarted","Data":"12cceced10b16a85270eae91aeebc7862948743572be75b01af3f39e038b4258"} Apr 20 10:02:32.817274 ip-10-0-138-148 kubenswrapper[2577]: 
I0420 10:02:32.817245 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4htjt" event={"ID":"aa479de0-842b-41a8-952f-4382abbdf250","Type":"ContainerStarted","Data":"3cef2496dd78abf02fb39319d197a48f9e671687f8fc62964bdc082ee56a2233"} Apr 20 10:02:32.819031 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.819008 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" event={"ID":"45c27faf-167f-454a-9c28-7d7fa2f034a7","Type":"ContainerStarted","Data":"6fe99521c862ebb632951c5d68163d8abb12fd0823d55c18d64d3cc7f0333de3"} Apr 20 10:02:32.819161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.819035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" event={"ID":"45c27faf-167f-454a-9c28-7d7fa2f034a7","Type":"ContainerStarted","Data":"e43e2626b6d49ddd7c96ed21efceaeaf65f1869696eac90967cb6ead8eea1292"} Apr 20 10:02:32.850216 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:32.850165 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-vt54g" podStartSLOduration=2.021703712 podStartE2EDuration="3.850149496s" podCreationTimestamp="2026-04-20 10:02:29 +0000 UTC" firstStartedPulling="2026-04-20 10:02:30.711397693 +0000 UTC m=+64.732243894" lastFinishedPulling="2026-04-20 10:02:32.539843461 +0000 UTC m=+66.560689678" observedRunningTime="2026-04-20 10:02:32.850001905 +0000 UTC m=+66.870848130" watchObservedRunningTime="2026-04-20 10:02:32.850149496 +0000 UTC m=+66.870995722" Apr 20 10:02:34.828150 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:34.828040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4htjt" event={"ID":"aa479de0-842b-41a8-952f-4382abbdf250","Type":"ContainerStarted","Data":"e7a71b9673dc55981acdca075fbca5f2e851f29853c37b32c6610d917effa5c3"} Apr 20 
10:02:34.828150 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:34.828081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4htjt" event={"ID":"aa479de0-842b-41a8-952f-4382abbdf250","Type":"ContainerStarted","Data":"b8c16bbe3ee360264121d95ad728bf89f160df7fc6a2cb7c837e3f124a4bdf48"} Apr 20 10:02:34.848051 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:34.847956 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4htjt" podStartSLOduration=67.378708181 podStartE2EDuration="1m8.847938449s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:02:32.662174087 +0000 UTC m=+66.683020290" lastFinishedPulling="2026-04-20 10:02:34.131404355 +0000 UTC m=+68.152250558" observedRunningTime="2026-04-20 10:02:34.847648951 +0000 UTC m=+68.868495176" watchObservedRunningTime="2026-04-20 10:02:34.847938449 +0000 UTC m=+68.868784674" Apr 20 10:02:35.025297 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.025261 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"] Apr 20 10:02:35.051939 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.051913 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"] Apr 20 10:02:35.052093 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.052050 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444" Apr 20 10:02:35.054612 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.054591 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 10:02:35.054708 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.054613 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-z7zjq\"" Apr 20 10:02:35.054708 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.054613 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 10:02:35.057773 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.057751 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"] Apr 20 10:02:35.073333 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.073293 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jpgzh"] Apr 20 10:02:35.073490 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.073472 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" Apr 20 10:02:35.076100 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.076078 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 10:02:35.076624 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.076451 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-27cts\"" Apr 20 10:02:35.076624 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.076484 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 10:02:35.077074 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.076787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 10:02:35.085721 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.085704 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"] Apr 20 10:02:35.085852 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.085834 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.088396 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.088130 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gq8bt\"" Apr 20 10:02:35.088396 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.088214 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 10:02:35.088396 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.088251 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 10:02:35.088649 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.088536 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 10:02:35.122457 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-sys\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.122568 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122479 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-textfile\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.122568 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ca1e44-73f9-4982-9492-529ac3ad8e18-metrics-client-ca\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.122568 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" Apr 20 10:02:35.122568 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67c5326b-25f5-40b4-9fb5-76cb61e11800-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444" Apr 20 10:02:35.122770 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0b1003b-e990-4b70-bfb9-08b2cd905f97-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" Apr 20 10:02:35.122770 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-tls\") pod 
\"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.122770 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbh7d\" (UniqueName: \"kubernetes.io/projected/67c5326b-25f5-40b4-9fb5-76cb61e11800-kube-api-access-nbh7d\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444" Apr 20 10:02:35.122770 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-wtmp\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.122770 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122729 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-accelerators-collector-config\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.122963 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b0b1003b-e990-4b70-bfb9-08b2cd905f97-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" Apr 20 10:02:35.122963 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" Apr 20 10:02:35.122963 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122869 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.122963 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122919 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-root\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh" Apr 20 10:02:35.123153 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.122963 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cp8l\" (UniqueName: \"kubernetes.io/projected/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-api-access-6cp8l\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" Apr 20 10:02:35.123153 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.123001 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.123153 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.123030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdxk4\" (UniqueName: \"kubernetes.io/projected/c7ca1e44-73f9-4982-9492-529ac3ad8e18-kube-api-access-zdxk4\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.123153 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.123076 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.123153 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.123102 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.224206 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-sys\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224364 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224223 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-textfile\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224364 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ca1e44-73f9-4982-9492-529ac3ad8e18-metrics-client-ca\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224364 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.224364 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224318 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67c5326b-25f5-40b4-9fb5-76cb61e11800-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.224364 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-sys\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224364 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0b1003b-e990-4b70-bfb9-08b2cd905f97-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-tls\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224409 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbh7d\" (UniqueName: \"kubernetes.io/projected/67c5326b-25f5-40b4-9fb5-76cb61e11800-kube-api-access-nbh7d\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224436 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-wtmp\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224469 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-accelerators-collector-config\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224496 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b0b1003b-e990-4b70-bfb9-08b2cd905f97-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224549 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-root\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.224710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-textfile\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.225119 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.224612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cp8l\" (UniqueName: \"kubernetes.io/projected/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-api-access-6cp8l\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.225119 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225058 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.225119 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:35.225076 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 10:02:35.225119 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225095 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdxk4\" (UniqueName: \"kubernetes.io/projected/c7ca1e44-73f9-4982-9492-529ac3ad8e18-kube-api-access-zdxk4\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.225119 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ca1e44-73f9-4982-9492-529ac3ad8e18-metrics-client-ca\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.225405 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.225405 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:35.225148 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-tls podName:c7ca1e44-73f9-4982-9492-529ac3ad8e18 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:35.725128002 +0000 UTC m=+69.745974212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-tls") pod "node-exporter-jpgzh" (UID: "c7ca1e44-73f9-4982-9492-529ac3ad8e18") : secret "node-exporter-tls" not found
Apr 20 10:02:35.225405 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-wtmp\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.225405 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/67c5326b-25f5-40b4-9fb5-76cb61e11800-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.225405 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225280 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.225678 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:35.225419 2577 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 10:02:35.225678 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:35.225471 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-tls podName:67c5326b-25f5-40b4-9fb5-76cb61e11800 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:35.725453687 +0000 UTC m=+69.746299906 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-vq444" (UID: "67c5326b-25f5-40b4-9fb5-76cb61e11800") : secret "openshift-state-metrics-tls" not found
Apr 20 10:02:35.225678 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c7ca1e44-73f9-4982-9492-529ac3ad8e18-root\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.225678 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b0b1003b-e990-4b70-bfb9-08b2cd905f97-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.226006 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.225975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-accelerators-collector-config\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.228124 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.228079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.228335 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.228291 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.228852 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.228811 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.229630 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.229604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.233937 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.233893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.234022 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.233953 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0b1003b-e990-4b70-bfb9-08b2cd905f97-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.240936 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.240896 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbh7d\" (UniqueName: \"kubernetes.io/projected/67c5326b-25f5-40b4-9fb5-76cb61e11800-kube-api-access-nbh7d\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.241620 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.241595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdxk4\" (UniqueName: \"kubernetes.io/projected/c7ca1e44-73f9-4982-9492-529ac3ad8e18-kube-api-access-zdxk4\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.242164 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.242144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cp8l\" (UniqueName: \"kubernetes.io/projected/b0b1003b-e990-4b70-bfb9-08b2cd905f97-kube-api-access-6cp8l\") pod \"kube-state-metrics-69db897b98-ffbl4\" (UID: \"b0b1003b-e990-4b70-bfb9-08b2cd905f97\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.386862 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.386784 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"
Apr 20 10:02:35.720134 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.720101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-ffbl4"]
Apr 20 10:02:35.722894 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:35.722866 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0b1003b_e990_4b70_bfb9_08b2cd905f97.slice/crio-f027518a41b2fc09b451c3b19cd94540b4a9d6afcc5975edbc6e9065ebc17939 WatchSource:0}: Error finding container f027518a41b2fc09b451c3b19cd94540b4a9d6afcc5975edbc6e9065ebc17939: Status 404 returned error can't find the container with id f027518a41b2fc09b451c3b19cd94540b4a9d6afcc5975edbc6e9065ebc17939
Apr 20 10:02:35.728919 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.728896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.729011 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.728951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-tls\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.731151 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.731126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1e44-73f9-4982-9492-529ac3ad8e18-node-exporter-tls\") pod \"node-exporter-jpgzh\" (UID: \"c7ca1e44-73f9-4982-9492-529ac3ad8e18\") " pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:35.731244 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.731187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/67c5326b-25f5-40b4-9fb5-76cb61e11800-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-vq444\" (UID: \"67c5326b-25f5-40b4-9fb5-76cb61e11800\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.831268 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.831226 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b258f" event={"ID":"340db265-d04c-46d7-b5b0-6141dced7313","Type":"ContainerStarted","Data":"083fb40848dd91c509d5467161f007d898de17432c4d1bfc0b9984f4a3909617"}
Apr 20 10:02:35.831589 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.831315 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b258f"
Apr 20 10:02:35.832347 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.832296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" event={"ID":"b0b1003b-e990-4b70-bfb9-08b2cd905f97","Type":"ContainerStarted","Data":"f027518a41b2fc09b451c3b19cd94540b4a9d6afcc5975edbc6e9065ebc17939"}
Apr 20 10:02:35.847001 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.846959 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b258f" podStartSLOduration=66.868951074 podStartE2EDuration="1m9.846946925s" podCreationTimestamp="2026-04-20 10:01:26 +0000 UTC" firstStartedPulling="2026-04-20 10:02:32.682427422 +0000 UTC m=+66.703273623" lastFinishedPulling="2026-04-20 10:02:35.66042327 +0000 UTC m=+69.681269474" observedRunningTime="2026-04-20 10:02:35.845784857 +0000 UTC m=+69.866631082" watchObservedRunningTime="2026-04-20 10:02:35.846946925 +0000 UTC m=+69.867793148"
Apr 20 10:02:35.965277 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.965220 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"
Apr 20 10:02:35.996148 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:35.996121 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jpgzh"
Apr 20 10:02:36.004758 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:36.004669 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ca1e44_73f9_4982_9492_529ac3ad8e18.slice/crio-6a01495aa588bddd61f646dcb2596c1657a462bb23445aa7768535e9536a74b1 WatchSource:0}: Error finding container 6a01495aa588bddd61f646dcb2596c1657a462bb23445aa7768535e9536a74b1: Status 404 returned error can't find the container with id 6a01495aa588bddd61f646dcb2596c1657a462bb23445aa7768535e9536a74b1
Apr 20 10:02:36.092857 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.092743 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-vq444"]
Apr 20 10:02:36.095423 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:36.095388 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c5326b_25f5_40b4_9fb5_76cb61e11800.slice/crio-b8b6df9727c0d76901348d1d08d3233893f42a047728f42f7fbcdcde6edfc52a WatchSource:0}: Error finding container b8b6df9727c0d76901348d1d08d3233893f42a047728f42f7fbcdcde6edfc52a: Status 404 returned error can't find the container with id b8b6df9727c0d76901348d1d08d3233893f42a047728f42f7fbcdcde6edfc52a
Apr 20 10:02:36.102591 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.102572 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 10:02:36.110775 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.110756 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.112928 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.112891 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 10:02:36.113048 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.112953 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 10:02:36.113173 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.113139 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 10:02:36.113275 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.113194 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 10:02:36.113275 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.113202 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-lrsxl\""
Apr 20 10:02:36.113507 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.113468 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 10:02:36.113507 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.113494 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 10:02:36.113622 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.113524 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 10:02:36.113622 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.113543 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 10:02:36.113622 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.113497 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 10:02:36.118896 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.118878 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132262 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn2dp\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-kube-api-access-nn2dp\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132416 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-web-config\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-out\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-volume\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.132927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.132776 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234054 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-volume\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234174 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234174 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234174 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234174 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nn2dp\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-kube-api-access-nn2dp\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234412 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234412 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234412 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234412 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234412 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-web-config\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234412 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-out\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234693 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:02:36.234432 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle podName:39129b52-8ee4-42a2-8c1c-8b5ff26944d2 nodeName:}" failed. No retries permitted until 2026-04-20 10:02:36.734407084 +0000 UTC m=+70.755253289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2") : configmap references non-existent config key: ca-bundle.crt
Apr 20 10:02:36.234693 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234693 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.234887 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.234688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.235603 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.235547 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:02:36.237016
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.236966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-volume\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.237107 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.237064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-out\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.237107 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.237076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.237458 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.237428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.237662 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.237641 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.237709 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.237643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-web-config\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.237916 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.237899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-tls-assets\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.238790 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.238619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.239415 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.239396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.246628 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.246605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn2dp\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-kube-api-access-nn2dp\") pod \"alertmanager-main-0\" 
(UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.738794 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.738757 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.739605 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.739575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.758193 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.758171 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:02:36.839111 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.839069 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jpgzh" event={"ID":"c7ca1e44-73f9-4982-9492-529ac3ad8e18","Type":"ContainerStarted","Data":"6a01495aa588bddd61f646dcb2596c1657a462bb23445aa7768535e9536a74b1"} Apr 20 10:02:36.841578 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.841550 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444" event={"ID":"67c5326b-25f5-40b4-9fb5-76cb61e11800","Type":"ContainerStarted","Data":"1ec7f3289eedf73284dc96e6ca91e631538688833f5c74607db489d2ebc57ad2"} Apr 20 10:02:36.841682 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.841668 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444" event={"ID":"67c5326b-25f5-40b4-9fb5-76cb61e11800","Type":"ContainerStarted","Data":"b97cbf91c96be3191c7cc12e52fe247f85ef94dca81e30a2488f61c0b52ed7ff"} Apr 20 10:02:36.841747 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:36.841688 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444" event={"ID":"67c5326b-25f5-40b4-9fb5-76cb61e11800","Type":"ContainerStarted","Data":"b8b6df9727c0d76901348d1d08d3233893f42a047728f42f7fbcdcde6edfc52a"} Apr 20 10:02:37.113510 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.113475 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-d5f699f68-zgg4r"] Apr 20 10:02:37.116771 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.116751 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.119296 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.119274 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 10:02:37.119469 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.119438 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 10:02:37.119582 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.119517 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fmlg8eed3jijt\"" Apr 20 10:02:37.119582 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.119529 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 10:02:37.119582 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.119545 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 10:02:37.119756 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.119518 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-xqq8d\"" Apr 20 10:02:37.119899 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.119879 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 10:02:37.127656 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.127631 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-d5f699f68-zgg4r"] Apr 20 10:02:37.141453 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.141428 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.141540 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.141469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-tls\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.141599 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.141544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-metrics-client-ca\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.141654 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.141617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2rwv\" (UniqueName: \"kubernetes.io/projected/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-kube-api-access-r2rwv\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.141705 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.141689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.141761 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.141720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.141812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.141761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.141812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.141790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-grpc-tls\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.243802 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.243747 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-metrics-client-ca\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: 
\"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.243948 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.242992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-metrics-client-ca\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.244589 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.244567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2rwv\" (UniqueName: \"kubernetes.io/projected/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-kube-api-access-r2rwv\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.244781 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.244765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.245363 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.245344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.246247 ip-10-0-138-148 kubenswrapper[2577]: I0420 
10:02:37.246228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.246754 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.246738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-grpc-tls\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.250188 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.248974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.250188 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.249014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-tls\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.250910 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.250882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-grpc-tls\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.251803 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.251753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.251892 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.251801 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.252878 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.252836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-tls\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.253710 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.253685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: 
\"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.254340 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.254315 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2rwv\" (UniqueName: \"kubernetes.io/projected/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-kube-api-access-r2rwv\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.254590 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.254572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/db351594-bd1d-4f30-a5a3-d8f3e7f8864b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-d5f699f68-zgg4r\" (UID: \"db351594-bd1d-4f30-a5a3-d8f3e7f8864b\") " pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.299833 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.299763 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 10:02:37.428033 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.428004 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" Apr 20 10:02:37.572459 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.572426 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-d5f699f68-zgg4r"] Apr 20 10:02:37.750185 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:37.750115 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb351594_bd1d_4f30_a5a3_d8f3e7f8864b.slice/crio-b461e7e1b2cc2f8d3617b10e6fb1c0e0f9935a659723c9480117a262cea9286c WatchSource:0}: Error finding container b461e7e1b2cc2f8d3617b10e6fb1c0e0f9935a659723c9480117a262cea9286c: Status 404 returned error can't find the container with id b461e7e1b2cc2f8d3617b10e6fb1c0e0f9935a659723c9480117a262cea9286c Apr 20 10:02:37.849255 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.849228 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" event={"ID":"db351594-bd1d-4f30-a5a3-d8f3e7f8864b","Type":"ContainerStarted","Data":"b461e7e1b2cc2f8d3617b10e6fb1c0e0f9935a659723c9480117a262cea9286c"} Apr 20 10:02:37.850272 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.850242 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerStarted","Data":"6e2d68c546e27d0b1e6cb17b0bb89135f742ed06d51221ef3fd830ec1d961ef0"} Apr 20 10:02:37.852096 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.852063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" event={"ID":"b0b1003b-e990-4b70-bfb9-08b2cd905f97","Type":"ContainerStarted","Data":"c70c975bdaa88546ce13339094b97da181b8e3a403fc038dcd29ed3f068a2ff3"} Apr 20 10:02:37.852294 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.852096 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" event={"ID":"b0b1003b-e990-4b70-bfb9-08b2cd905f97","Type":"ContainerStarted","Data":"74b01f30f1d9cf47de2624ec2eb3ba7124146094e8942048ef7dec54c2391903"} Apr 20 10:02:37.852294 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.852110 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" event={"ID":"b0b1003b-e990-4b70-bfb9-08b2cd905f97","Type":"ContainerStarted","Data":"281bd901e92bfa01458950d01813d20724a9048ae128d4d6197070d976a53a2b"} Apr 20 10:02:37.853697 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.853677 2577 generic.go:358] "Generic (PLEG): container finished" podID="c7ca1e44-73f9-4982-9492-529ac3ad8e18" containerID="1a5df0f19a60f983d548ffaa0cb8ac4d98fbf03c9337227ea9789a805c7c2491" exitCode=0 Apr 20 10:02:37.853785 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.853721 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jpgzh" event={"ID":"c7ca1e44-73f9-4982-9492-529ac3ad8e18","Type":"ContainerDied","Data":"1a5df0f19a60f983d548ffaa0cb8ac4d98fbf03c9337227ea9789a805c7c2491"} Apr 20 10:02:37.875899 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:37.875733 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-ffbl4" podStartSLOduration=1.4503564820000001 podStartE2EDuration="2.875716151s" podCreationTimestamp="2026-04-20 10:02:35 +0000 UTC" firstStartedPulling="2026-04-20 10:02:35.725000021 +0000 UTC m=+69.745846227" lastFinishedPulling="2026-04-20 10:02:37.150359685 +0000 UTC m=+71.171205896" observedRunningTime="2026-04-20 10:02:37.874928245 +0000 UTC m=+71.895774470" watchObservedRunningTime="2026-04-20 10:02:37.875716151 +0000 UTC m=+71.896562376" Apr 20 10:02:38.861213 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:38.861137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-jpgzh" event={"ID":"c7ca1e44-73f9-4982-9492-529ac3ad8e18","Type":"ContainerStarted","Data":"efd624b9a12cea429d5026da12cf248287b7219afb3ed9aa0dc4ac8e6e446440"} Apr 20 10:02:38.861213 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:38.861181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jpgzh" event={"ID":"c7ca1e44-73f9-4982-9492-529ac3ad8e18","Type":"ContainerStarted","Data":"6d59582bdefa47781c1ed37e59cdbc98904cf3add5b2c4c19efcf07325ca89ca"} Apr 20 10:02:38.863289 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:38.863239 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444" event={"ID":"67c5326b-25f5-40b4-9fb5-76cb61e11800","Type":"ContainerStarted","Data":"27911e528e6b5f2b62ae6be2bd5f3b4370836a4dd698664d4ea8b7c5e081e4b9"} Apr 20 10:02:38.864537 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:38.864510 2577 generic.go:358] "Generic (PLEG): container finished" podID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerID="6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312" exitCode=0 Apr 20 10:02:38.864655 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:38.864584 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerDied","Data":"6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312"} Apr 20 10:02:38.883325 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:38.883266 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jpgzh" podStartSLOduration=2.738427288 podStartE2EDuration="3.883243527s" podCreationTimestamp="2026-04-20 10:02:35 +0000 UTC" firstStartedPulling="2026-04-20 10:02:36.006842999 +0000 UTC m=+70.027689201" lastFinishedPulling="2026-04-20 10:02:37.151659222 +0000 UTC m=+71.172505440" 
observedRunningTime="2026-04-20 10:02:38.881521 +0000 UTC m=+72.902367253" watchObservedRunningTime="2026-04-20 10:02:38.883243527 +0000 UTC m=+72.904089751"
Apr 20 10:02:38.899021 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:38.898987 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-vq444" podStartSLOduration=2.362256972 podStartE2EDuration="3.898972744s" podCreationTimestamp="2026-04-20 10:02:35 +0000 UTC" firstStartedPulling="2026-04-20 10:02:36.256713567 +0000 UTC m=+70.277559769" lastFinishedPulling="2026-04-20 10:02:37.793429335 +0000 UTC m=+71.814275541" observedRunningTime="2026-04-20 10:02:38.898485118 +0000 UTC m=+72.919331342" watchObservedRunningTime="2026-04-20 10:02:38.898972744 +0000 UTC m=+72.919818969"
Apr 20 10:02:39.338524 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.338493 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-668b98d794-gx58x"]
Apr 20 10:02:39.341574 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.341556 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.344123 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.344104 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 10:02:39.344563 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.344547 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-efdiehgikaoih\""
Apr 20 10:02:39.344665 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.344647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 20 10:02:39.344729 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.344709 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 20 10:02:39.344786 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.344723 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 20 10:02:39.344786 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.344753 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-tw6x4\""
Apr 20 10:02:39.356125 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.356087 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-668b98d794-gx58x"]
Apr 20 10:02:39.366317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.365989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-secret-metrics-server-client-certs\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.366317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.366056 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c0820ec-79be-4122-9bc9-af1459969f09-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.366317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.366089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-client-ca-bundle\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.366317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.366126 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1c0820ec-79be-4122-9bc9-af1459969f09-audit-log\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.366317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.366157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1c0820ec-79be-4122-9bc9-af1459969f09-metrics-server-audit-profiles\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.366317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.366191 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5dq\" (UniqueName: \"kubernetes.io/projected/1c0820ec-79be-4122-9bc9-af1459969f09-kube-api-access-xs5dq\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.366317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.366230 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-secret-metrics-server-tls\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.466787 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.466759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-secret-metrics-server-client-certs\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.466938 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.466806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c0820ec-79be-4122-9bc9-af1459969f09-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.466938 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.466830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-client-ca-bundle\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.466938 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.466859 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1c0820ec-79be-4122-9bc9-af1459969f09-audit-log\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.466938 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.466880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1c0820ec-79be-4122-9bc9-af1459969f09-metrics-server-audit-profiles\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.466938 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.466908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5dq\" (UniqueName: \"kubernetes.io/projected/1c0820ec-79be-4122-9bc9-af1459969f09-kube-api-access-xs5dq\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.466938 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.466931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-secret-metrics-server-tls\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.467601 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.467544 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/1c0820ec-79be-4122-9bc9-af1459969f09-audit-log\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.467708 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.467620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c0820ec-79be-4122-9bc9-af1459969f09-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.467920 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.467899 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/1c0820ec-79be-4122-9bc9-af1459969f09-metrics-server-audit-profiles\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.469585 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.469562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-client-ca-bundle\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.469694 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.469590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-secret-metrics-server-client-certs\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.470141 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.470121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/1c0820ec-79be-4122-9bc9-af1459969f09-secret-metrics-server-tls\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.475228 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.475204 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5dq\" (UniqueName: \"kubernetes.io/projected/1c0820ec-79be-4122-9bc9-af1459969f09-kube-api-access-xs5dq\") pod \"metrics-server-668b98d794-gx58x\" (UID: \"1c0820ec-79be-4122-9bc9-af1459969f09\") " pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.651133 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.651058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-668b98d794-gx58x"
Apr 20 10:02:39.796910 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.796878 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"]
Apr 20 10:02:39.800988 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.800605 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"
Apr 20 10:02:39.802736 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.802639 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 10:02:39.802736 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.802689 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-skfbz\""
Apr 20 10:02:39.808267 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.808229 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"]
Apr 20 10:02:39.873886 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.870368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7bb9843a-089e-47f2-91c3-e7ca46156663-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w8jls\" (UID: \"7bb9843a-089e-47f2-91c3-e7ca46156663\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"
Apr 20 10:02:39.972600 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.971874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7bb9843a-089e-47f2-91c3-e7ca46156663-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w8jls\" (UID: \"7bb9843a-089e-47f2-91c3-e7ca46156663\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"
Apr 20 10:02:39.975142 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:39.975093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7bb9843a-089e-47f2-91c3-e7ca46156663-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-w8jls\" (UID: \"7bb9843a-089e-47f2-91c3-e7ca46156663\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"
Apr 20 10:02:40.113765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.113740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"
Apr 20 10:02:40.140686 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.140659 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-668b98d794-gx58x"]
Apr 20 10:02:40.145616 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:40.145340 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c0820ec_79be_4122_9bc9_af1459969f09.slice/crio-be49a0e8325c5b8aff620b5cf0f8dab6c32a1c94c4b2e5db8db45c40efe50c30 WatchSource:0}: Error finding container be49a0e8325c5b8aff620b5cf0f8dab6c32a1c94c4b2e5db8db45c40efe50c30: Status 404 returned error can't find the container with id be49a0e8325c5b8aff620b5cf0f8dab6c32a1c94c4b2e5db8db45c40efe50c30
Apr 20 10:02:40.251869 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.251840 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"]
Apr 20 10:02:40.606766 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:40.606730 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb9843a_089e_47f2_91c3_e7ca46156663.slice/crio-87a5f99e01e5bda530f8b83c28281f2edcabba9b7cff0d5877c33bf5e5e99934 WatchSource:0}: Error finding container 87a5f99e01e5bda530f8b83c28281f2edcabba9b7cff0d5877c33bf5e5e99934: Status 404 returned error can't find the container with id 87a5f99e01e5bda530f8b83c28281f2edcabba9b7cff0d5877c33bf5e5e99934
Apr 20 10:02:40.874974 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.874949 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" event={"ID":"db351594-bd1d-4f30-a5a3-d8f3e7f8864b","Type":"ContainerStarted","Data":"b2c0d4a5c67c13d1cf9923be05bb3a103b6551c8000a0d3e78cdae1498a3590b"}
Apr 20 10:02:40.875333 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.874984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" event={"ID":"db351594-bd1d-4f30-a5a3-d8f3e7f8864b","Type":"ContainerStarted","Data":"11e93ac67573a62edd21203df0c20b221dbd66502a9a5b4bb66c3f757c8eaae9"}
Apr 20 10:02:40.875333 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.874994 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" event={"ID":"db351594-bd1d-4f30-a5a3-d8f3e7f8864b","Type":"ContainerStarted","Data":"222b5f736d4f6e900765c27719462d607c8a5ada6fbcc9ce7b735d6d6fb38d79"}
Apr 20 10:02:40.877927 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.877900 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerStarted","Data":"bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34"}
Apr 20 10:02:40.878030 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.877935 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerStarted","Data":"e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9"}
Apr 20 10:02:40.878030 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.877948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerStarted","Data":"07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7"}
Apr 20 10:02:40.879605 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.879539 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls" event={"ID":"7bb9843a-089e-47f2-91c3-e7ca46156663","Type":"ContainerStarted","Data":"87a5f99e01e5bda530f8b83c28281f2edcabba9b7cff0d5877c33bf5e5e99934"}
Apr 20 10:02:40.880713 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:40.880686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-668b98d794-gx58x" event={"ID":"1c0820ec-79be-4122-9bc9-af1459969f09","Type":"ContainerStarted","Data":"be49a0e8325c5b8aff620b5cf0f8dab6c32a1c94c4b2e5db8db45c40efe50c30"}
Apr 20 10:02:41.886983 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:41.886946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerStarted","Data":"dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb"}
Apr 20 10:02:41.886983 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:41.886980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerStarted","Data":"349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2"}
Apr 20 10:02:42.786265 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.786239 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b68f8c6f9-rrmkb"
Apr 20 10:02:42.892614 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.892537 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" event={"ID":"db351594-bd1d-4f30-a5a3-d8f3e7f8864b","Type":"ContainerStarted","Data":"85a7169373ba43fd8f410a27cac54474176bd1d89d773709fd2d0c9dc5f0cf6c"}
Apr 20 10:02:42.892614 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.892574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" event={"ID":"db351594-bd1d-4f30-a5a3-d8f3e7f8864b","Type":"ContainerStarted","Data":"982cc058085ac585166c0fe36984f6866b7fe060391c4ae021af6ab1cc6b4928"}
Apr 20 10:02:42.892614 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.892590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" event={"ID":"db351594-bd1d-4f30-a5a3-d8f3e7f8864b","Type":"ContainerStarted","Data":"51830f1ff6b81fcd79b15eccaf393f466f383f7360b09450e511f7fc1e38ea90"}
Apr 20 10:02:42.893116 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.892703 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r"
Apr 20 10:02:42.895622 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.895597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerStarted","Data":"c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe"}
Apr 20 10:02:42.896972 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.896951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls" event={"ID":"7bb9843a-089e-47f2-91c3-e7ca46156663","Type":"ContainerStarted","Data":"a976209437c12248debbe96b69b501724457093a44f28823009d056bea8591db"}
Apr 20 10:02:42.897175 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.897154 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"
Apr 20 10:02:42.898389 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.898359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-668b98d794-gx58x" event={"ID":"1c0820ec-79be-4122-9bc9-af1459969f09","Type":"ContainerStarted","Data":"ef1f071032d8826354cb085ff2f0885a9e86f932d04beff647b6355b500139d5"}
Apr 20 10:02:42.901827 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.901808 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls"
Apr 20 10:02:42.913812 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.913768 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r" podStartSLOduration=1.256528876 podStartE2EDuration="5.91375588s" podCreationTimestamp="2026-04-20 10:02:37 +0000 UTC" firstStartedPulling="2026-04-20 10:02:37.752142593 +0000 UTC m=+71.772988795" lastFinishedPulling="2026-04-20 10:02:42.409369593 +0000 UTC m=+76.430215799" observedRunningTime="2026-04-20 10:02:42.912552335 +0000 UTC m=+76.933398558" watchObservedRunningTime="2026-04-20 10:02:42.91375588 +0000 UTC m=+76.934602117"
Apr 20 10:02:42.931644 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.931610 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-668b98d794-gx58x" podStartSLOduration=1.6700120790000001 podStartE2EDuration="3.931598355s" podCreationTimestamp="2026-04-20 10:02:39 +0000 UTC" firstStartedPulling="2026-04-20 10:02:40.147780839 +0000 UTC m=+74.168627045" lastFinishedPulling="2026-04-20 10:02:42.409367115 +0000 UTC m=+76.430213321" observedRunningTime="2026-04-20 10:02:42.929863435 +0000 UTC m=+76.950709658" watchObservedRunningTime="2026-04-20 10:02:42.931598355 +0000 UTC m=+76.952444579"
Apr 20 10:02:42.948937 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.948904 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-w8jls" podStartSLOduration=2.145363223 podStartE2EDuration="3.948893427s" podCreationTimestamp="2026-04-20 10:02:39 +0000 UTC" firstStartedPulling="2026-04-20 10:02:40.609496336 +0000 UTC m=+74.630342550" lastFinishedPulling="2026-04-20 10:02:42.413026551 +0000 UTC m=+76.433872754" observedRunningTime="2026-04-20 10:02:42.947602062 +0000 UTC m=+76.968448284" watchObservedRunningTime="2026-04-20 10:02:42.948893427 +0000 UTC m=+76.969739651"
Apr 20 10:02:42.975335 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:42.975025 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.87250913 podStartE2EDuration="6.975010899s" podCreationTimestamp="2026-04-20 10:02:36 +0000 UTC" firstStartedPulling="2026-04-20 10:02:37.307050919 +0000 UTC m=+71.327897128" lastFinishedPulling="2026-04-20 10:02:42.409552689 +0000 UTC m=+76.430398897" observedRunningTime="2026-04-20 10:02:42.973725831 +0000 UTC m=+76.994572066" watchObservedRunningTime="2026-04-20 10:02:42.975010899 +0000 UTC m=+76.995857124"
Apr 20 10:02:45.905069 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:45.905032 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rfsz8"]
Apr 20 10:02:45.911396 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:45.911372 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rfsz8"
Apr 20 10:02:45.913605 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:45.913581 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-phkcc\""
Apr 20 10:02:45.913709 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:45.913625 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 10:02:45.913709 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:45.913584 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 10:02:45.916611 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:45.916587 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rfsz8"]
Apr 20 10:02:46.020174 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:46.020145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjntc\" (UniqueName: \"kubernetes.io/projected/d6698a90-789d-4e6d-89f3-76b2282e07f7-kube-api-access-sjntc\") pod \"downloads-6bcc868b7-rfsz8\" (UID: \"d6698a90-789d-4e6d-89f3-76b2282e07f7\") " pod="openshift-console/downloads-6bcc868b7-rfsz8"
Apr 20 10:02:46.121279 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:46.121253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjntc\" (UniqueName: \"kubernetes.io/projected/d6698a90-789d-4e6d-89f3-76b2282e07f7-kube-api-access-sjntc\") pod \"downloads-6bcc868b7-rfsz8\" (UID: \"d6698a90-789d-4e6d-89f3-76b2282e07f7\") " pod="openshift-console/downloads-6bcc868b7-rfsz8"
Apr 20 10:02:46.144183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:46.144159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjntc\" (UniqueName: \"kubernetes.io/projected/d6698a90-789d-4e6d-89f3-76b2282e07f7-kube-api-access-sjntc\") pod \"downloads-6bcc868b7-rfsz8\" (UID: \"d6698a90-789d-4e6d-89f3-76b2282e07f7\") " pod="openshift-console/downloads-6bcc868b7-rfsz8"
Apr 20 10:02:46.222000 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:46.221934 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rfsz8"
Apr 20 10:02:46.354300 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:46.354277 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rfsz8"]
Apr 20 10:02:46.356682 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:02:46.356653 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6698a90_789d_4e6d_89f3_76b2282e07f7.slice/crio-051a1e8ffe1e12db5a31644eb164560f5506de1c67ba56f971c1a0a8878fbec5 WatchSource:0}: Error finding container 051a1e8ffe1e12db5a31644eb164560f5506de1c67ba56f971c1a0a8878fbec5: Status 404 returned error can't find the container with id 051a1e8ffe1e12db5a31644eb164560f5506de1c67ba56f971c1a0a8878fbec5
Apr 20 10:02:46.914068 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:46.914040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rfsz8" event={"ID":"d6698a90-789d-4e6d-89f3-76b2282e07f7","Type":"ContainerStarted","Data":"051a1e8ffe1e12db5a31644eb164560f5506de1c67ba56f971c1a0a8878fbec5"}
Apr 20 10:02:48.908500 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:48.908471 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-d5f699f68-zgg4r"
Apr 20 10:02:56.214897 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.214863 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-857bd89697-dnh4j"]
Apr 20 10:02:56.219638 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.219618 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.222111 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.222087 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 10:02:56.222207 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.222190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 10:02:56.222273 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.222242 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 10:02:56.222273 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.222253 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 10:02:56.222395 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.222382 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 10:02:56.222956 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.222936 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-dt7j8\""
Apr 20 10:02:56.228001 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.227971 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857bd89697-dnh4j"]
Apr 20 10:02:56.312636 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.312604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-serving-cert\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.312781 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.312651 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-service-ca\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.312781 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.312703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-config\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.312781 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.312754 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-oauth-config\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.312942 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.312789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-oauth-serving-cert\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.312942 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.312853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnfl2\" (UniqueName: \"kubernetes.io/projected/0a9aaf8d-5053-4658-a4db-447e96d9f96d-kube-api-access-pnfl2\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.414066 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.414041 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-serving-cert\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.414202 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.414078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-service-ca\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.414202 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.414108 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-config\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.414202 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.414145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-oauth-config\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.414202 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.414177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-oauth-serving-cert\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.414377 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.414214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnfl2\" (UniqueName: \"kubernetes.io/projected/0a9aaf8d-5053-4658-a4db-447e96d9f96d-kube-api-access-pnfl2\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.414852 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.414825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-config\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.414965 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.414825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-service-ca\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.415160 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.415131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-oauth-serving-cert\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.416799 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.416777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-oauth-config\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.416887 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.416813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-serving-cert\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.424602 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.424575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnfl2\" (UniqueName: \"kubernetes.io/projected/0a9aaf8d-5053-4658-a4db-447e96d9f96d-kube-api-access-pnfl2\") pod \"console-857bd89697-dnh4j\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:02:56.532570 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:56.532546 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-857bd89697-dnh4j" Apr 20 10:02:59.652080 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:59.652041 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-668b98d794-gx58x" Apr 20 10:02:59.652552 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:02:59.652099 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-668b98d794-gx58x" Apr 20 10:03:02.078454 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:02.078429 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857bd89697-dnh4j"] Apr 20 10:03:02.083079 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:03:02.083052 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a9aaf8d_5053_4658_a4db_447e96d9f96d.slice/crio-9617030a44b5d7481ea5ed459fe5853e3922fe1e852465feeb9c5728b5c7d720 WatchSource:0}: Error finding container 9617030a44b5d7481ea5ed459fe5853e3922fe1e852465feeb9c5728b5c7d720: Status 404 returned error can't find the container with id 9617030a44b5d7481ea5ed459fe5853e3922fe1e852465feeb9c5728b5c7d720 Apr 20 10:03:02.971481 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:02.971423 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rfsz8" event={"ID":"d6698a90-789d-4e6d-89f3-76b2282e07f7","Type":"ContainerStarted","Data":"815fbc8725fe7143b5c971b0dd4bdbbf7ef4f99219f668a6207ba3994a50effd"} Apr 20 10:03:02.971828 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:02.971709 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rfsz8" Apr 20 10:03:02.973810 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:02.973773 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857bd89697-dnh4j" 
event={"ID":"0a9aaf8d-5053-4658-a4db-447e96d9f96d","Type":"ContainerStarted","Data":"9617030a44b5d7481ea5ed459fe5853e3922fe1e852465feeb9c5728b5c7d720"} Apr 20 10:03:02.990114 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:02.989945 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rfsz8" Apr 20 10:03:02.990497 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:02.990445 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rfsz8" podStartSLOduration=2.2819166810000002 podStartE2EDuration="17.990430075s" podCreationTimestamp="2026-04-20 10:02:45 +0000 UTC" firstStartedPulling="2026-04-20 10:02:46.358841866 +0000 UTC m=+80.379688068" lastFinishedPulling="2026-04-20 10:03:02.067355257 +0000 UTC m=+96.088201462" observedRunningTime="2026-04-20 10:03:02.988910355 +0000 UTC m=+97.009756573" watchObservedRunningTime="2026-04-20 10:03:02.990430075 +0000 UTC m=+97.011276302" Apr 20 10:03:05.293689 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:05.293581 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-f6m7h_033bb25f-40eb-4a7f-ad48-e552fca86c6d/serve-healthcheck-canary/0.log" Apr 20 10:03:06.014769 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.014741 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c46b74b5d-4g96h"] Apr 20 10:03:06.047737 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.047649 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c46b74b5d-4g96h"] Apr 20 10:03:06.047832 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.047771 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.057087 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.057062 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 10:03:06.203829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.203757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-oauth-serving-cert\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.203974 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.203832 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-serving-cert\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.203974 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.203884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-service-ca\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.203974 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.203914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-trusted-ca-bundle\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " 
pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.203974 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.203950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-oauth-config\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.204143 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.203984 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-console-config\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.204143 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.204013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b59lf\" (UniqueName: \"kubernetes.io/projected/faf58061-677c-44da-94a5-457c4ee5ec7e-kube-api-access-b59lf\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.304625 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.304595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-serving-cert\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305028 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.304628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-service-ca\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305028 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.304666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-trusted-ca-bundle\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305028 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.304692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-oauth-config\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305028 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.304744 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-console-config\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305028 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.304792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b59lf\" (UniqueName: \"kubernetes.io/projected/faf58061-677c-44da-94a5-457c4ee5ec7e-kube-api-access-b59lf\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305028 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.304838 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-oauth-serving-cert\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305523 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.305499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-service-ca\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305673 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.305646 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-trusted-ca-bundle\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305769 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.305534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-oauth-serving-cert\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.305769 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.305680 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-console-config\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.307729 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.307704 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-oauth-config\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.307729 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.307716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-serving-cert\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.315201 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.315176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b59lf\" (UniqueName: \"kubernetes.io/projected/faf58061-677c-44da-94a5-457c4ee5ec7e-kube-api-access-b59lf\") pod \"console-5c46b74b5d-4g96h\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.358230 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.358208 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:06.503644 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.503575 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c46b74b5d-4g96h"] Apr 20 10:03:06.507708 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:03:06.507680 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf58061_677c_44da_94a5_457c4ee5ec7e.slice/crio-a0af670bc1c74554c8e24aac847fd08454c649e9b016113e9e8e56ce4f617284 WatchSource:0}: Error finding container a0af670bc1c74554c8e24aac847fd08454c649e9b016113e9e8e56ce4f617284: Status 404 returned error can't find the container with id a0af670bc1c74554c8e24aac847fd08454c649e9b016113e9e8e56ce4f617284 Apr 20 10:03:06.844317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.844286 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b258f" Apr 20 10:03:06.989780 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.989741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c46b74b5d-4g96h" event={"ID":"faf58061-677c-44da-94a5-457c4ee5ec7e","Type":"ContainerStarted","Data":"8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385"} Apr 20 10:03:06.989949 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.989785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c46b74b5d-4g96h" event={"ID":"faf58061-677c-44da-94a5-457c4ee5ec7e","Type":"ContainerStarted","Data":"a0af670bc1c74554c8e24aac847fd08454c649e9b016113e9e8e56ce4f617284"} Apr 20 10:03:06.991491 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:06.991460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857bd89697-dnh4j" 
event={"ID":"0a9aaf8d-5053-4658-a4db-447e96d9f96d","Type":"ContainerStarted","Data":"5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d"} Apr 20 10:03:07.012491 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:07.012446 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c46b74b5d-4g96h" podStartSLOduration=2.012429377 podStartE2EDuration="2.012429377s" podCreationTimestamp="2026-04-20 10:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:03:07.010318665 +0000 UTC m=+101.031164885" watchObservedRunningTime="2026-04-20 10:03:07.012429377 +0000 UTC m=+101.033275615" Apr 20 10:03:07.032541 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:07.032496 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-857bd89697-dnh4j" podStartSLOduration=7.17237012 podStartE2EDuration="11.032485667s" podCreationTimestamp="2026-04-20 10:02:56 +0000 UTC" firstStartedPulling="2026-04-20 10:03:02.085258516 +0000 UTC m=+96.106104717" lastFinishedPulling="2026-04-20 10:03:05.945374054 +0000 UTC m=+99.966220264" observedRunningTime="2026-04-20 10:03:07.031463376 +0000 UTC m=+101.052309601" watchObservedRunningTime="2026-04-20 10:03:07.032485667 +0000 UTC m=+101.053331893" Apr 20 10:03:16.359211 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:16.359174 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:16.359730 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:16.359238 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:16.364473 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:16.364215 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:16.542659 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:16.542631 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-857bd89697-dnh4j" Apr 20 10:03:16.542659 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:16.542664 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-857bd89697-dnh4j" Apr 20 10:03:16.544064 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:16.544042 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-857bd89697-dnh4j" Apr 20 10:03:17.030027 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:17.029998 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:03:17.030612 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:17.030593 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-857bd89697-dnh4j" Apr 20 10:03:17.108963 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:17.108931 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-857bd89697-dnh4j"] Apr 20 10:03:19.658114 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:19.658088 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-668b98d794-gx58x" Apr 20 10:03:19.663244 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:19.663196 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-668b98d794-gx58x" Apr 20 10:03:44.057388 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.057337 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-857bd89697-dnh4j" podUID="0a9aaf8d-5053-4658-a4db-447e96d9f96d" containerName="console" 
containerID="cri-o://5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d" gracePeriod=15 Apr 20 10:03:44.328506 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.328484 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-857bd89697-dnh4j_0a9aaf8d-5053-4658-a4db-447e96d9f96d/console/0.log" Apr 20 10:03:44.328617 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.328567 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857bd89697-dnh4j" Apr 20 10:03:44.396436 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.396413 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-oauth-config\") pod \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " Apr 20 10:03:44.396538 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.396452 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-service-ca\") pod \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " Apr 20 10:03:44.396538 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.396489 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-config\") pod \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " Apr 20 10:03:44.396711 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.396682 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-serving-cert\") pod 
\"0a9aaf8d-5053-4658-a4db-447e96d9f96d\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " Apr 20 10:03:44.396822 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.396733 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnfl2\" (UniqueName: \"kubernetes.io/projected/0a9aaf8d-5053-4658-a4db-447e96d9f96d-kube-api-access-pnfl2\") pod \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " Apr 20 10:03:44.396887 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.396829 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-config" (OuterVolumeSpecName: "console-config") pod "0a9aaf8d-5053-4658-a4db-447e96d9f96d" (UID: "0a9aaf8d-5053-4658-a4db-447e96d9f96d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:03:44.396887 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.396873 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-service-ca" (OuterVolumeSpecName: "service-ca") pod "0a9aaf8d-5053-4658-a4db-447e96d9f96d" (UID: "0a9aaf8d-5053-4658-a4db-447e96d9f96d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:03:44.397005 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.396887 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-oauth-serving-cert\") pod \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\" (UID: \"0a9aaf8d-5053-4658-a4db-447e96d9f96d\") " Apr 20 10:03:44.397183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.397127 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-service-ca\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:03:44.397183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.397150 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:03:44.397324 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.397282 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0a9aaf8d-5053-4658-a4db-447e96d9f96d" (UID: "0a9aaf8d-5053-4658-a4db-447e96d9f96d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:03:44.398757 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.398731 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a9aaf8d-5053-4658-a4db-447e96d9f96d-kube-api-access-pnfl2" (OuterVolumeSpecName: "kube-api-access-pnfl2") pod "0a9aaf8d-5053-4658-a4db-447e96d9f96d" (UID: "0a9aaf8d-5053-4658-a4db-447e96d9f96d"). InnerVolumeSpecName "kube-api-access-pnfl2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:03:44.398757 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.398746 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0a9aaf8d-5053-4658-a4db-447e96d9f96d" (UID: "0a9aaf8d-5053-4658-a4db-447e96d9f96d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:03:44.398869 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.398801 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0a9aaf8d-5053-4658-a4db-447e96d9f96d" (UID: "0a9aaf8d-5053-4658-a4db-447e96d9f96d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:03:44.497655 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.497635 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-oauth-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:03:44.497655 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.497654 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a9aaf8d-5053-4658-a4db-447e96d9f96d-console-serving-cert\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:03:44.497777 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.497666 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pnfl2\" (UniqueName: \"kubernetes.io/projected/0a9aaf8d-5053-4658-a4db-447e96d9f96d-kube-api-access-pnfl2\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:03:44.497777 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:44.497675 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0a9aaf8d-5053-4658-a4db-447e96d9f96d-oauth-serving-cert\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:45.110639 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.110614 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-857bd89697-dnh4j_0a9aaf8d-5053-4658-a4db-447e96d9f96d/console/0.log"
Apr 20 10:03:45.110981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.110653 2577 generic.go:358] "Generic (PLEG): container finished" podID="0a9aaf8d-5053-4658-a4db-447e96d9f96d" containerID="5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d" exitCode=2
Apr 20 10:03:45.110981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.110683 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857bd89697-dnh4j" event={"ID":"0a9aaf8d-5053-4658-a4db-447e96d9f96d","Type":"ContainerDied","Data":"5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d"}
Apr 20 10:03:45.110981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.110715 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857bd89697-dnh4j"
Apr 20 10:03:45.110981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.110723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857bd89697-dnh4j" event={"ID":"0a9aaf8d-5053-4658-a4db-447e96d9f96d","Type":"ContainerDied","Data":"9617030a44b5d7481ea5ed459fe5853e3922fe1e852465feeb9c5728b5c7d720"}
Apr 20 10:03:45.110981 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.110740 2577 scope.go:117] "RemoveContainer" containerID="5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d"
Apr 20 10:03:45.124005 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.123988 2577 scope.go:117] "RemoveContainer" containerID="5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d"
Apr 20 10:03:45.124262 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:03:45.124232 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d\": container with ID starting with 5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d not found: ID does not exist" containerID="5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d"
Apr 20 10:03:45.124429 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.124261 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d"} err="failed to get container status \"5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d\": rpc error: code = NotFound desc = could not find container \"5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d\": container with ID starting with 5058036c59b7367d9d1f281ec96781fabdc00acae8bf98181d5bdd1bf288200d not found: ID does not exist"
Apr 20 10:03:45.131145 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.131122 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-857bd89697-dnh4j"]
Apr 20 10:03:45.137590 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:45.137570 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-857bd89697-dnh4j"]
Apr 20 10:03:46.537554 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:46.537524 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a9aaf8d-5053-4658-a4db-447e96d9f96d" path="/var/lib/kubelet/pods/0a9aaf8d-5053-4658-a4db-447e96d9f96d/volumes"
Apr 20 10:03:55.473005 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:55.472967 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 10:03:55.473862 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:55.473829 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="alertmanager" containerID="cri-o://07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7" gracePeriod=120
Apr 20 10:03:55.474177 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:55.474082 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy" containerID="cri-o://349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2" gracePeriod=120
Apr 20 10:03:55.474266 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:55.474243 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="prom-label-proxy" containerID="cri-o://c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe" gracePeriod=120
Apr 20 10:03:55.474669 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:55.474277 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="config-reloader" containerID="cri-o://e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9" gracePeriod=120
Apr 20 10:03:55.474669 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:55.474357 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy-metric" containerID="cri-o://dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb" gracePeriod=120
Apr 20 10:03:55.474669 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:55.474430 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy-web" containerID="cri-o://bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34" gracePeriod=120
Apr 20 10:03:56.143932 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.143897 2577 generic.go:358] "Generic (PLEG): container finished" podID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerID="c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe" exitCode=0
Apr 20 10:03:56.143932 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.143924 2577 generic.go:358] "Generic (PLEG): container finished" podID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerID="dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb" exitCode=0
Apr 20 10:03:56.143932 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.143931 2577 generic.go:358] "Generic (PLEG): container finished" podID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerID="349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2" exitCode=0
Apr 20 10:03:56.143932 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.143937 2577 generic.go:358] "Generic (PLEG): container finished" podID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerID="e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9" exitCode=0
Apr 20 10:03:56.144161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.143943 2577 generic.go:358] "Generic (PLEG): container finished" podID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerID="07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7" exitCode=0
Apr 20 10:03:56.144161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.143966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerDied","Data":"c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe"}
Apr 20 10:03:56.144161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.143999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerDied","Data":"dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb"}
Apr 20 10:03:56.144161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.144011 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerDied","Data":"349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2"}
Apr 20 10:03:56.144161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.144020 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerDied","Data":"e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9"}
Apr 20 10:03:56.144161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.144029 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerDied","Data":"07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7"}
Apr 20 10:03:56.706290 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.706270 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:03:56.784035 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784008 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn2dp\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-kube-api-access-nn2dp\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784179 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784054 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-volume\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784179 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784086 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-cluster-tls-config\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784179 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784130 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-main-db\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784179 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784161 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-tls-assets\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784411 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784208 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-metric\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784411 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784240 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-main-tls\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784411 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784278 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-web-config\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784411 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784327 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-metrics-client-ca\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784411 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784370 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784662 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784429 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784662 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784462 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-out\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.784662 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.784490 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-web\") pod \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\" (UID: \"39129b52-8ee4-42a2-8c1c-8b5ff26944d2\") "
Apr 20 10:03:56.785399 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.785125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 10:03:56.786268 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.786238 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 10:03:56.786449 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.786425 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 10:03:56.786791 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.786710 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-kube-api-access-nn2dp" (OuterVolumeSpecName: "kube-api-access-nn2dp") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "kube-api-access-nn2dp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 10:03:56.787141 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.787101 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:03:56.787681 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.787642 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:03:56.788152 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.788123 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:03:56.789659 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.789607 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:03:56.789659 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.789616 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-out" (OuterVolumeSpecName: "config-out") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 10:03:56.790869 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.790840 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:03:56.791252 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.791224 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 10:03:56.794733 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.794710 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:03:56.801336 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.801296 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-web-config" (OuterVolumeSpecName: "web-config") pod "39129b52-8ee4-42a2-8c1c-8b5ff26944d2" (UID: "39129b52-8ee4-42a2-8c1c-8b5ff26944d2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 10:03:56.885888 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885833 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.885888 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885857 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.885888 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885867 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-out\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.885888 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885876 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.885888 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885885 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nn2dp\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-kube-api-access-nn2dp\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.886075 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885895 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-config-volume\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.886075 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885904 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-cluster-tls-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.886075 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885913 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-alertmanager-main-db\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.886075 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885922 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-tls-assets\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.886075 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885930 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.886075 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885939 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-secret-alertmanager-main-tls\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.886075 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885948 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-web-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:56.886075 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:56.885958 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39129b52-8ee4-42a2-8c1c-8b5ff26944d2-metrics-client-ca\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\""
Apr 20 10:03:57.149267 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.149209 2577 generic.go:358] "Generic (PLEG): container finished" podID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerID="bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34" exitCode=0
Apr 20 10:03:57.149370 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.149287 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerDied","Data":"bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34"}
Apr 20 10:03:57.149370 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.149343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"39129b52-8ee4-42a2-8c1c-8b5ff26944d2","Type":"ContainerDied","Data":"6e2d68c546e27d0b1e6cb17b0bb89135f742ed06d51221ef3fd830ec1d961ef0"}
Apr 20 10:03:57.149370 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.149350 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 10:03:57.149370 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.149361 2577 scope.go:117] "RemoveContainer" containerID="c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe"
Apr 20 10:03:57.156801 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.156783 2577 scope.go:117] "RemoveContainer" containerID="dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb"
Apr 20 10:03:57.163451 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.163431 2577 scope.go:117] "RemoveContainer" containerID="349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2"
Apr 20 10:03:57.169593 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.169575 2577 scope.go:117] "RemoveContainer" containerID="bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34"
Apr 20 10:03:57.173014 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.172989 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 10:03:57.176798 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.176784 2577 scope.go:117] "RemoveContainer" containerID="e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9"
Apr 20 10:03:57.177672 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.177656 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 10:03:57.183108 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.183094 2577 scope.go:117] "RemoveContainer" containerID="07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7"
Apr 20 10:03:57.189050 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.189033 2577 scope.go:117] "RemoveContainer" containerID="6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312"
Apr 20 10:03:57.194914 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.194898 2577 scope.go:117] "RemoveContainer" containerID="c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe"
Apr 20 10:03:57.195142 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:03:57.195127 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe\": container with ID starting with c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe not found: ID does not exist" containerID="c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe"
Apr 20 10:03:57.195204 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.195148 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe"} err="failed to get container status \"c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe\": rpc error: code = NotFound desc = could not find container \"c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe\": container with ID starting with c5fed5f879f1eb760d2db457021990424b87c4d8d18244e94c38bf427d58f2fe not found: ID does not exist"
Apr 20 10:03:57.195204 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.195171 2577 scope.go:117] "RemoveContainer" containerID="dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb"
Apr 20 10:03:57.195382 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:03:57.195364 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb\": container with ID starting with dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb not found: ID does not exist" containerID="dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb"
Apr 20 10:03:57.195426 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.195389 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb"} err="failed to get container status \"dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb\": rpc error: code = NotFound desc = could not find container \"dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb\": container with ID starting with dc846a222b54f22d1a889557e82e1d61cc3432d7f988a52fae98d1c8adc0c8eb not found: ID does not exist"
Apr 20 10:03:57.195426 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.195405 2577 scope.go:117] "RemoveContainer" containerID="349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2"
Apr 20 10:03:57.195625 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:03:57.195612 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2\": container with ID starting with 349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2 not found: ID does not exist" containerID="349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2"
Apr 20 10:03:57.195670 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.195626 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2"} err="failed to get container status \"349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2\": rpc error: code = NotFound desc = could not find container \"349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2\": container with ID starting with 349c7d64f35f10198bbdf629a080b16060c58fcc419c4b25c450704ef8f694b2 not found: ID does not exist"
Apr 20 10:03:57.195670 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.195638 2577 scope.go:117] "RemoveContainer" containerID="bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34"
Apr 20 10:03:57.195819 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:03:57.195804 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34\": container with ID starting with bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34 not found: ID does not exist" containerID="bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34"
Apr 20 10:03:57.195861 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.195825 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34"} err="failed to get container status \"bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34\": rpc error: code = NotFound desc = could not find container \"bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34\": container with ID starting with bda81ddc6f6c18db7679a9d538a336fbd9fba7b0ab72fee775dc923208a9ef34 not found: ID does not exist"
Apr 20 10:03:57.195861 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.195839 2577 scope.go:117] "RemoveContainer" containerID="e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9"
Apr 20 10:03:57.196067 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:03:57.196050 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9\": container with ID starting with e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9 not found: ID does not exist" containerID="e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9"
Apr 20 10:03:57.196117 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.196070 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9"} err="failed to get container status \"e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9\": rpc error: code = NotFound desc = could not find container \"e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9\": container with ID starting with e5ab64029ebea63b0a3cc76e4163ea0d417b1f64be247e14d4a92566c0c28bd9 not found: ID does not exist"
Apr 20 10:03:57.196117 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.196083 2577 scope.go:117] "RemoveContainer" containerID="07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7"
Apr 20 10:03:57.196260 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:03:57.196246 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7\": container with ID starting with 07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7 not found: ID does not exist" containerID="07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7"
Apr 20 10:03:57.196296 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.196262 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7"} err="failed to get container status \"07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7\": rpc error: code = NotFound desc = could not find container \"07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7\": container with ID starting with 07c0ac85e06003066cb73477064b22d48bbca73cf08926428ff7ed980eb3fee7 not found: ID does not exist"
Apr 20 10:03:57.196296 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.196273 2577 scope.go:117] "RemoveContainer" containerID="6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312"
Apr 20 10:03:57.196510 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:03:57.196482 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312\": container with ID starting with 6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312 not found: ID does not exist" containerID="6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312"
Apr 20 10:03:57.196566 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.196512 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312"} err="failed to get container status \"6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312\": rpc error: code = NotFound desc = could not find container \"6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312\": container with ID starting with 6e1bac2998ff6b0408e10c427ad48cb54922fd826483246081f692af4aa48312 not found: ID does not exist"
Apr 20 10:03:57.211825 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.211807 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 10:03:57.212089 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212075 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy-metric"
Apr 20 10:03:57.212129 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212103 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy-metric"
Apr 20 10:03:57.212129 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212114 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="alertmanager"
Apr 20 10:03:57.212129 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212119 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="alertmanager"
Apr 20 10:03:57.212129 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212128 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="init-config-reloader"
Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212135 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="init-config-reloader"
Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212142 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy"
Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212147 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy"
Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212156 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="prom-label-proxy"
Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212164 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="prom-label-proxy"
Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212172 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0a9aaf8d-5053-4658-a4db-447e96d9f96d" containerName="console"
Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212177 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a9aaf8d-5053-4658-a4db-447e96d9f96d" containerName="console"
Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]:
I0420 10:03:57.212186 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="config-reloader" Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212192 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="config-reloader" Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212199 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy-web" Apr 20 10:03:57.212247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212212 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy-web" Apr 20 10:03:57.212567 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212278 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy-web" Apr 20 10:03:57.212567 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212288 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy-metric" Apr 20 10:03:57.212567 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212295 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0a9aaf8d-5053-4658-a4db-447e96d9f96d" containerName="console" Apr 20 10:03:57.212567 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212315 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="alertmanager" Apr 20 10:03:57.212567 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212324 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="kube-rbac-proxy" Apr 20 10:03:57.212567 ip-10-0-138-148 
kubenswrapper[2577]: I0420 10:03:57.212330 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="config-reloader" Apr 20 10:03:57.212567 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.212337 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" containerName="prom-label-proxy" Apr 20 10:03:57.217570 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.217554 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.221991 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.221968 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 10:03:57.222087 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.222071 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 10:03:57.222239 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.222222 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 10:03:57.222367 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.222350 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 10:03:57.222830 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.222803 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 10:03:57.222916 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.222834 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 10:03:57.222916 ip-10-0-138-148 kubenswrapper[2577]: I0420 
10:03:57.222840 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 10:03:57.223029 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.222803 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 10:03:57.223029 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.222855 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-lrsxl\"" Apr 20 10:03:57.229466 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.229438 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 10:03:57.241288 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.241270 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 10:03:57.289128 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289218 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f75c8fa0-09a4-4c62-aa67-d9762dbea003-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289218 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289153 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289218 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289171 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-config-volume\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289390 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289247 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289390 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289282 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289390 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289320 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289390 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75c8fa0-09a4-4c62-aa67-d9762dbea003-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289539 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f75c8fa0-09a4-4c62-aa67-d9762dbea003-config-out\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289539 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289462 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f75c8fa0-09a4-4c62-aa67-d9762dbea003-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289539 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-web-config\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289539 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289506 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f75c8fa0-09a4-4c62-aa67-d9762dbea003-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.289539 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.289523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wsw7\" (UniqueName: \"kubernetes.io/projected/f75c8fa0-09a4-4c62-aa67-d9762dbea003-kube-api-access-4wsw7\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.390737 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f75c8fa0-09a4-4c62-aa67-d9762dbea003-config-out\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.390831 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f75c8fa0-09a4-4c62-aa67-d9762dbea003-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.390831 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390775 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-web-config\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.390831 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390790 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f75c8fa0-09a4-4c62-aa67-d9762dbea003-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.390831 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390810 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wsw7\" (UniqueName: \"kubernetes.io/projected/f75c8fa0-09a4-4c62-aa67-d9762dbea003-kube-api-access-4wsw7\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390864 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f75c8fa0-09a4-4c62-aa67-d9762dbea003-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-config-volume\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.390981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.391010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391395 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.391038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f75c8fa0-09a4-4c62-aa67-d9762dbea003-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391395 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.391194 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f75c8fa0-09a4-4c62-aa67-d9762dbea003-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.391503 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.391481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f75c8fa0-09a4-4c62-aa67-d9762dbea003-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.392129 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.392103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f75c8fa0-09a4-4c62-aa67-d9762dbea003-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.393461 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.393436 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f75c8fa0-09a4-4c62-aa67-d9762dbea003-config-out\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.394180 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.393929 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.394180 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.394123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f75c8fa0-09a4-4c62-aa67-d9762dbea003-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.394180 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.394151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.394371 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.394236 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.394371 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.394259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-web-config\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.394371 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.394259 
2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.394543 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.394527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.395387 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.395371 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f75c8fa0-09a4-4c62-aa67-d9762dbea003-config-volume\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.399655 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.399616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wsw7\" (UniqueName: \"kubernetes.io/projected/f75c8fa0-09a4-4c62-aa67-d9762dbea003-kube-api-access-4wsw7\") pod \"alertmanager-main-0\" (UID: \"f75c8fa0-09a4-4c62-aa67-d9762dbea003\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.527233 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.527206 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 10:03:57.657845 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:57.657785 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 10:03:57.666132 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:03:57.666105 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf75c8fa0_09a4_4c62_aa67_d9762dbea003.slice/crio-6f1f0d8cb2c90f886cc1da96ac80ad99deedcf324442804720f3f3465c2c2ff8 WatchSource:0}: Error finding container 6f1f0d8cb2c90f886cc1da96ac80ad99deedcf324442804720f3f3465c2c2ff8: Status 404 returned error can't find the container with id 6f1f0d8cb2c90f886cc1da96ac80ad99deedcf324442804720f3f3465c2c2ff8 Apr 20 10:03:58.154613 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:58.154587 2577 generic.go:358] "Generic (PLEG): container finished" podID="f75c8fa0-09a4-4c62-aa67-d9762dbea003" containerID="0c49c8b0961cfbefd956f4059e2377c203e7799f91dbe70207cfd8b1a4f13ebe" exitCode=0 Apr 20 10:03:58.154905 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:58.154633 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f75c8fa0-09a4-4c62-aa67-d9762dbea003","Type":"ContainerDied","Data":"0c49c8b0961cfbefd956f4059e2377c203e7799f91dbe70207cfd8b1a4f13ebe"} Apr 20 10:03:58.154905 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:58.154652 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f75c8fa0-09a4-4c62-aa67-d9762dbea003","Type":"ContainerStarted","Data":"6f1f0d8cb2c90f886cc1da96ac80ad99deedcf324442804720f3f3465c2c2ff8"} Apr 20 10:03:58.538786 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:58.538763 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39129b52-8ee4-42a2-8c1c-8b5ff26944d2" 
path="/var/lib/kubelet/pods/39129b52-8ee4-42a2-8c1c-8b5ff26944d2/volumes" Apr 20 10:03:59.160202 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.160171 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f75c8fa0-09a4-4c62-aa67-d9762dbea003","Type":"ContainerStarted","Data":"b9b370c954779aea1f570e382b95949089297e1423816e6bdf31ba12c5b8eb48"} Apr 20 10:03:59.160202 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.160206 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f75c8fa0-09a4-4c62-aa67-d9762dbea003","Type":"ContainerStarted","Data":"b903156184d4c5421598540b6d17245a865d783ff2eb6732faf15227497acfb4"} Apr 20 10:03:59.160615 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.160217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f75c8fa0-09a4-4c62-aa67-d9762dbea003","Type":"ContainerStarted","Data":"09cf72bc4f9b2b2287b1e1d0cfc24852bfd2181d341441c1398a5ce07f44d256"} Apr 20 10:03:59.160615 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.160230 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f75c8fa0-09a4-4c62-aa67-d9762dbea003","Type":"ContainerStarted","Data":"4f044cda7c71d93bbd8d9274c2a392b016abefd2ad55872a3bfae7389d31e93a"} Apr 20 10:03:59.160615 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.160239 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f75c8fa0-09a4-4c62-aa67-d9762dbea003","Type":"ContainerStarted","Data":"48dc9a14b5eebc0e0ea1f8c8725232582948db772f2bca8b40d0fa11e7328791"} Apr 20 10:03:59.160615 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.160247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f75c8fa0-09a4-4c62-aa67-d9762dbea003","Type":"ContainerStarted","Data":"ce20a51686f8aba5e1d55438f4a4a6ede1ed96e565598df49f76f3e56f0ae244"} Apr 20 10:03:59.186294 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.186247 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.18623423 podStartE2EDuration="2.18623423s" podCreationTimestamp="2026-04-20 10:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:03:59.184259869 +0000 UTC m=+153.205106094" watchObservedRunningTime="2026-04-20 10:03:59.18623423 +0000 UTC m=+153.207080461" Apr 20 10:03:59.439980 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.439895 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-687d956bd-rcphn"] Apr 20 10:03:59.444266 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.444243 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.447229 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.447208 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 10:03:59.447644 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.447613 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 10:03:59.447754 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.447712 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-g4sg2\"" Apr 20 10:03:59.448011 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.447988 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 10:03:59.448108 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.448067 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 10:03:59.449872 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.449853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 10:03:59.454803 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.454766 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 10:03:59.456621 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.456603 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-687d956bd-rcphn"] Apr 20 10:03:59.508080 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.508051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-federate-client-tls\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.508200 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.508089 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.508200 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.508132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cg92\" (UniqueName: \"kubernetes.io/projected/00471395-7962-401e-82db-0a745547e86a-kube-api-access-6cg92\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.508200 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.508170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-serving-certs-ca-bundle\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.508200 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.508194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-telemeter-client-tls\") pod 
\"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.508383 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.508259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.508383 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.508333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-metrics-client-ca\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.508383 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.508380 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-secret-telemeter-client\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609038 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-secret-telemeter-client\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " 
pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-federate-client-tls\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cg92\" (UniqueName: \"kubernetes.io/projected/00471395-7962-401e-82db-0a745547e86a-kube-api-access-6cg92\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-serving-certs-ca-bundle\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-telemeter-client-tls\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609476 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609476 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-metrics-client-ca\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.609973 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.609943 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-metrics-client-ca\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.610218 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.610195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-telemeter-trusted-ca-bundle\") pod \"telemeter-client-687d956bd-rcphn\" (UID: 
\"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.611738 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.611638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00471395-7962-401e-82db-0a745547e86a-serving-certs-ca-bundle\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.611997 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.611932 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-federate-client-tls\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.612260 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.612243 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-telemeter-client-tls\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.612439 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.612415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-secret-telemeter-client\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.614138 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.614101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00471395-7962-401e-82db-0a745547e86a-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.617151 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.617125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cg92\" (UniqueName: \"kubernetes.io/projected/00471395-7962-401e-82db-0a745547e86a-kube-api-access-6cg92\") pod \"telemeter-client-687d956bd-rcphn\" (UID: \"00471395-7962-401e-82db-0a745547e86a\") " pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.756054 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.756024 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" Apr 20 10:03:59.899842 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:03:59.899818 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-687d956bd-rcphn"] Apr 20 10:03:59.902117 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:03:59.902089 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00471395_7962_401e_82db_0a745547e86a.slice/crio-1099b32f222d63220a452f85f171b9c865e60dd14f7f8740741257d457643f1f WatchSource:0}: Error finding container 1099b32f222d63220a452f85f171b9c865e60dd14f7f8740741257d457643f1f: Status 404 returned error can't find the container with id 1099b32f222d63220a452f85f171b9c865e60dd14f7f8740741257d457643f1f Apr 20 10:04:00.165138 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:00.165050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" 
event={"ID":"00471395-7962-401e-82db-0a745547e86a","Type":"ContainerStarted","Data":"1099b32f222d63220a452f85f171b9c865e60dd14f7f8740741257d457643f1f"} Apr 20 10:04:02.173099 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:02.173069 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" event={"ID":"00471395-7962-401e-82db-0a745547e86a","Type":"ContainerStarted","Data":"7142be1c8cafb8946a9f9dd4c2f4a92168306af763eebd1b17bb8071e962d863"} Apr 20 10:04:03.178025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:03.177991 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" event={"ID":"00471395-7962-401e-82db-0a745547e86a","Type":"ContainerStarted","Data":"f362854653774584caefe348fe74d7cc90a2b667e7094995dc95d05103f45b20"} Apr 20 10:04:03.178025 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:03.178026 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" event={"ID":"00471395-7962-401e-82db-0a745547e86a","Type":"ContainerStarted","Data":"6d28e81e9c136e408032b9ca20e3324fbda1885a6509302a5de1c947256e05b5"} Apr 20 10:04:03.203726 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:03.203671 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-687d956bd-rcphn" podStartSLOduration=2.018659246 podStartE2EDuration="4.203655307s" podCreationTimestamp="2026-04-20 10:03:59 +0000 UTC" firstStartedPulling="2026-04-20 10:03:59.904483168 +0000 UTC m=+153.925329372" lastFinishedPulling="2026-04-20 10:04:02.08947923 +0000 UTC m=+156.110325433" observedRunningTime="2026-04-20 10:04:03.201661708 +0000 UTC m=+157.222507932" watchObservedRunningTime="2026-04-20 10:04:03.203655307 +0000 UTC m=+157.224501531" Apr 20 10:04:04.017783 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.017753 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-57694c44f6-4bm8l"] Apr 20 10:04:04.021023 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.021004 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.033061 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.033039 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57694c44f6-4bm8l"] Apr 20 10:04:04.147763 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.147742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-service-ca\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.147871 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.147781 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-serving-cert\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.147871 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.147800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-trusted-ca-bundle\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.147871 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.147816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krbhx\" (UniqueName: 
\"kubernetes.io/projected/b8593538-b149-49a4-adf7-188d98db185b-kube-api-access-krbhx\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.147871 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.147866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-console-config\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.148000 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.147898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-oauth-config\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.148000 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.147956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-oauth-serving-cert\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.248875 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.248847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-oauth-config\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.249234 ip-10-0-138-148 kubenswrapper[2577]: 
I0420 10:04:04.249103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-oauth-serving-cert\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.249234 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.249163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-service-ca\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.249342 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.249299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-serving-cert\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.249386 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.249359 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-trusted-ca-bundle\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.249436 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.249385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krbhx\" (UniqueName: \"kubernetes.io/projected/b8593538-b149-49a4-adf7-188d98db185b-kube-api-access-krbhx\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " 
pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.249488 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.249443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-console-config\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.250138 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.250102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-oauth-serving-cert\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.250484 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.250461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-service-ca\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.250651 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.250628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-console-config\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.250747 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.250731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-trusted-ca-bundle\") pod \"console-57694c44f6-4bm8l\" (UID: 
\"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.251547 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.251526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-oauth-config\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.251924 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.251907 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-serving-cert\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.258271 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.258252 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krbhx\" (UniqueName: \"kubernetes.io/projected/b8593538-b149-49a4-adf7-188d98db185b-kube-api-access-krbhx\") pod \"console-57694c44f6-4bm8l\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.330253 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.330202 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:04.448970 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:04.448947 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57694c44f6-4bm8l"] Apr 20 10:04:04.451406 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:04:04.451369 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8593538_b149_49a4_adf7_188d98db185b.slice/crio-8dc64715c10b04dec25d940e5f1e18bdae52932179d3dbae641250d0df3e333e WatchSource:0}: Error finding container 8dc64715c10b04dec25d940e5f1e18bdae52932179d3dbae641250d0df3e333e: Status 404 returned error can't find the container with id 8dc64715c10b04dec25d940e5f1e18bdae52932179d3dbae641250d0df3e333e Apr 20 10:04:05.185403 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:05.185321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57694c44f6-4bm8l" event={"ID":"b8593538-b149-49a4-adf7-188d98db185b","Type":"ContainerStarted","Data":"cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c"} Apr 20 10:04:05.185403 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:05.185359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57694c44f6-4bm8l" event={"ID":"b8593538-b149-49a4-adf7-188d98db185b","Type":"ContainerStarted","Data":"8dc64715c10b04dec25d940e5f1e18bdae52932179d3dbae641250d0df3e333e"} Apr 20 10:04:05.207187 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:05.207132 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57694c44f6-4bm8l" podStartSLOduration=2.207118297 podStartE2EDuration="2.207118297s" podCreationTimestamp="2026-04-20 10:04:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:04:05.205630637 +0000 UTC 
m=+159.226476861" watchObservedRunningTime="2026-04-20 10:04:05.207118297 +0000 UTC m=+159.227964516" Apr 20 10:04:14.330900 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:14.330865 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:14.330900 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:14.330904 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:14.335626 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:14.335603 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:15.217747 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:15.217720 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:04:15.313320 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:15.313274 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c46b74b5d-4g96h"] Apr 20 10:04:40.331994 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.331926 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c46b74b5d-4g96h" podUID="faf58061-677c-44da-94a5-457c4ee5ec7e" containerName="console" containerID="cri-o://8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385" gracePeriod=15 Apr 20 10:04:40.563610 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.563589 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c46b74b5d-4g96h_faf58061-677c-44da-94a5-457c4ee5ec7e/console/0.log" Apr 20 10:04:40.563709 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.563648 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:04:40.724208 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724146 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b59lf\" (UniqueName: \"kubernetes.io/projected/faf58061-677c-44da-94a5-457c4ee5ec7e-kube-api-access-b59lf\") pod \"faf58061-677c-44da-94a5-457c4ee5ec7e\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " Apr 20 10:04:40.724208 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724181 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-console-config\") pod \"faf58061-677c-44da-94a5-457c4ee5ec7e\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " Apr 20 10:04:40.724393 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724217 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-oauth-serving-cert\") pod \"faf58061-677c-44da-94a5-457c4ee5ec7e\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " Apr 20 10:04:40.724393 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724326 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-service-ca\") pod \"faf58061-677c-44da-94a5-457c4ee5ec7e\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " Apr 20 10:04:40.724494 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724407 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-trusted-ca-bundle\") pod \"faf58061-677c-44da-94a5-457c4ee5ec7e\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " Apr 20 10:04:40.724494 ip-10-0-138-148 
kubenswrapper[2577]: I0420 10:04:40.724456 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-serving-cert\") pod \"faf58061-677c-44da-94a5-457c4ee5ec7e\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " Apr 20 10:04:40.724494 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724490 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-oauth-config\") pod \"faf58061-677c-44da-94a5-457c4ee5ec7e\" (UID: \"faf58061-677c-44da-94a5-457c4ee5ec7e\") " Apr 20 10:04:40.724716 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724666 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-console-config" (OuterVolumeSpecName: "console-config") pod "faf58061-677c-44da-94a5-457c4ee5ec7e" (UID: "faf58061-677c-44da-94a5-457c4ee5ec7e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:04:40.724716 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724681 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "faf58061-677c-44da-94a5-457c4ee5ec7e" (UID: "faf58061-677c-44da-94a5-457c4ee5ec7e"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:04:40.724716 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724696 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-service-ca" (OuterVolumeSpecName: "service-ca") pod "faf58061-677c-44da-94a5-457c4ee5ec7e" (UID: "faf58061-677c-44da-94a5-457c4ee5ec7e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:04:40.724950 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724804 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-console-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:04:40.724950 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724821 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-oauth-serving-cert\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:04:40.724950 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724830 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-service-ca\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:04:40.724950 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.724838 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "faf58061-677c-44da-94a5-457c4ee5ec7e" (UID: "faf58061-677c-44da-94a5-457c4ee5ec7e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:04:40.726278 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.726252 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faf58061-677c-44da-94a5-457c4ee5ec7e-kube-api-access-b59lf" (OuterVolumeSpecName: "kube-api-access-b59lf") pod "faf58061-677c-44da-94a5-457c4ee5ec7e" (UID: "faf58061-677c-44da-94a5-457c4ee5ec7e"). InnerVolumeSpecName "kube-api-access-b59lf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:04:40.726486 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.726464 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "faf58061-677c-44da-94a5-457c4ee5ec7e" (UID: "faf58061-677c-44da-94a5-457c4ee5ec7e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:04:40.726552 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.726482 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "faf58061-677c-44da-94a5-457c4ee5ec7e" (UID: "faf58061-677c-44da-94a5-457c4ee5ec7e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:04:40.825962 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.825944 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-serving-cert\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:04:40.825962 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.825962 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/faf58061-677c-44da-94a5-457c4ee5ec7e-console-oauth-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:04:40.826073 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.825974 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b59lf\" (UniqueName: \"kubernetes.io/projected/faf58061-677c-44da-94a5-457c4ee5ec7e-kube-api-access-b59lf\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:04:40.826073 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:40.825984 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/faf58061-677c-44da-94a5-457c4ee5ec7e-trusted-ca-bundle\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:04:41.294917 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.294892 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c46b74b5d-4g96h_faf58061-677c-44da-94a5-457c4ee5ec7e/console/0.log" Apr 20 10:04:41.295020 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.294932 2577 generic.go:358] "Generic (PLEG): container finished" podID="faf58061-677c-44da-94a5-457c4ee5ec7e" containerID="8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385" exitCode=2 Apr 20 10:04:41.295020 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.295005 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c46b74b5d-4g96h" Apr 20 10:04:41.295020 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.295013 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c46b74b5d-4g96h" event={"ID":"faf58061-677c-44da-94a5-457c4ee5ec7e","Type":"ContainerDied","Data":"8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385"} Apr 20 10:04:41.295118 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.295040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c46b74b5d-4g96h" event={"ID":"faf58061-677c-44da-94a5-457c4ee5ec7e","Type":"ContainerDied","Data":"a0af670bc1c74554c8e24aac847fd08454c649e9b016113e9e8e56ce4f617284"} Apr 20 10:04:41.295118 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.295055 2577 scope.go:117] "RemoveContainer" containerID="8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385" Apr 20 10:04:41.303747 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.303730 2577 scope.go:117] "RemoveContainer" containerID="8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385" Apr 20 10:04:41.304036 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:04:41.304017 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385\": container with ID starting with 8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385 not found: ID does not exist" containerID="8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385" Apr 20 10:04:41.304083 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.304044 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385"} err="failed to get container status \"8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385\": rpc error: code = 
NotFound desc = could not find container \"8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385\": container with ID starting with 8b128ed737a836771f0ef06b34cd23fa8d8dc490f266ab5fb1d5c79fe2814385 not found: ID does not exist" Apr 20 10:04:41.317649 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.317624 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c46b74b5d-4g96h"] Apr 20 10:04:41.323137 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:41.323112 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c46b74b5d-4g96h"] Apr 20 10:04:42.538168 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:04:42.538133 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faf58061-677c-44da-94a5-457c4ee5ec7e" path="/var/lib/kubelet/pods/faf58061-677c-44da-94a5-457c4ee5ec7e/volumes" Apr 20 10:05:19.991938 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:19.991900 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74d95f456c-v4fcc"] Apr 20 10:05:19.992413 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:19.992236 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="faf58061-677c-44da-94a5-457c4ee5ec7e" containerName="console" Apr 20 10:05:19.992413 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:19.992247 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="faf58061-677c-44da-94a5-457c4ee5ec7e" containerName="console" Apr 20 10:05:19.992413 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:19.992318 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="faf58061-677c-44da-94a5-457c4ee5ec7e" containerName="console" Apr 20 10:05:19.995382 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:19.995356 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.010049 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.010023 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74d95f456c-v4fcc"] Apr 20 10:05:20.089538 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.089511 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-trusted-ca-bundle\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.089644 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.089544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-service-ca\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.089644 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.089568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-serving-cert\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.089760 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.089647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66xg9\" (UniqueName: \"kubernetes.io/projected/84146eb7-c871-4b19-be2a-5b9184c35fe5-kube-api-access-66xg9\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 
10:05:20.089760 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.089711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-oauth-serving-cert\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.089862 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.089772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-oauth-config\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.089862 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.089799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-config\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.190867 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.190838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-oauth-config\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.190979 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.190871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-config\") pod 
\"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.190979 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.190897 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-trusted-ca-bundle\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.190979 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.190940 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-service-ca\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.191139 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.190982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-serving-cert\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.191139 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.191024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66xg9\" (UniqueName: \"kubernetes.io/projected/84146eb7-c871-4b19-be2a-5b9184c35fe5-kube-api-access-66xg9\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.191139 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.191077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-oauth-serving-cert\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.191676 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.191650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-config\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.191866 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.191836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-service-ca\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.191941 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.191882 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-oauth-serving-cert\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.191941 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.191921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-trusted-ca-bundle\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.193636 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.193609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-oauth-config\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.193752 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.193735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-serving-cert\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.200433 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.200408 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66xg9\" (UniqueName: \"kubernetes.io/projected/84146eb7-c871-4b19-be2a-5b9184c35fe5-kube-api-access-66xg9\") pod \"console-74d95f456c-v4fcc\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.304906 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.304879 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:20.427647 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:20.427619 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74d95f456c-v4fcc"] Apr 20 10:05:20.430627 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:05:20.430605 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84146eb7_c871_4b19_be2a_5b9184c35fe5.slice/crio-943d5f196acbdf9e6189cbae73e3bb697eac8fdcbb487f1eb66e8946ecd47bd8 WatchSource:0}: Error finding container 943d5f196acbdf9e6189cbae73e3bb697eac8fdcbb487f1eb66e8946ecd47bd8: Status 404 returned error can't find the container with id 943d5f196acbdf9e6189cbae73e3bb697eac8fdcbb487f1eb66e8946ecd47bd8 Apr 20 10:05:21.415496 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:21.415460 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d95f456c-v4fcc" event={"ID":"84146eb7-c871-4b19-be2a-5b9184c35fe5","Type":"ContainerStarted","Data":"88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac"} Apr 20 10:05:21.415496 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:21.415498 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d95f456c-v4fcc" event={"ID":"84146eb7-c871-4b19-be2a-5b9184c35fe5","Type":"ContainerStarted","Data":"943d5f196acbdf9e6189cbae73e3bb697eac8fdcbb487f1eb66e8946ecd47bd8"} Apr 20 10:05:21.435732 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:21.435687 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74d95f456c-v4fcc" podStartSLOduration=2.435671191 podStartE2EDuration="2.435671191s" podCreationTimestamp="2026-04-20 10:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:05:21.433482935 +0000 UTC 
m=+235.454329159" watchObservedRunningTime="2026-04-20 10:05:21.435671191 +0000 UTC m=+235.456517415" Apr 20 10:05:30.305130 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:30.305092 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:30.305130 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:30.305138 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:30.309506 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:30.309485 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:30.447712 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:30.447689 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:05:30.512856 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:30.512830 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57694c44f6-4bm8l"] Apr 20 10:05:46.483173 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.483100 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-sdz58"] Apr 20 10:05:46.486223 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.486206 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.488208 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.488188 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 10:05:46.510496 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.510477 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sdz58"] Apr 20 10:05:46.583843 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.583821 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-kubelet-config\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.583976 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.583862 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-dbus\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.583976 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.583906 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-original-pull-secret\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.684945 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.684920 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-kubelet-config\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.685063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.684952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-dbus\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.685063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.684978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-original-pull-secret\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.685063 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.685051 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-kubelet-config\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.685182 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.685117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-dbus\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.687012 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.686988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/39b2e908-eabf-4b4f-ae1d-5e46c3d244cf-original-pull-secret\") pod \"global-pull-secret-syncer-sdz58\" (UID: \"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf\") " pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.795388 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.795360 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-sdz58" Apr 20 10:05:46.921970 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:46.921943 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-sdz58"] Apr 20 10:05:46.925870 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:05:46.925848 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b2e908_eabf_4b4f_ae1d_5e46c3d244cf.slice/crio-30b6c95df553636600e7374f1fdb4e9611b1f73b92fbdb1f5856ff6a73681843 WatchSource:0}: Error finding container 30b6c95df553636600e7374f1fdb4e9611b1f73b92fbdb1f5856ff6a73681843: Status 404 returned error can't find the container with id 30b6c95df553636600e7374f1fdb4e9611b1f73b92fbdb1f5856ff6a73681843 Apr 20 10:05:47.495124 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:47.495085 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sdz58" event={"ID":"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf","Type":"ContainerStarted","Data":"30b6c95df553636600e7374f1fdb4e9611b1f73b92fbdb1f5856ff6a73681843"} Apr 20 10:05:51.508745 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:51.508708 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-sdz58" event={"ID":"39b2e908-eabf-4b4f-ae1d-5e46c3d244cf","Type":"ContainerStarted","Data":"131be8ba6c819ad526d13a9ac4700915c9036b60191c731a7c7caaa13b6a0445"} Apr 20 10:05:55.531813 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.531764 2577 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57694c44f6-4bm8l" podUID="b8593538-b149-49a4-adf7-188d98db185b" containerName="console" containerID="cri-o://cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c" gracePeriod=15 Apr 20 10:05:55.767464 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.767441 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57694c44f6-4bm8l_b8593538-b149-49a4-adf7-188d98db185b/console/0.log" Apr 20 10:05:55.767569 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.767497 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:05:55.788604 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.788528 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-sdz58" podStartSLOduration=6.04692891 podStartE2EDuration="9.78851413s" podCreationTimestamp="2026-04-20 10:05:46 +0000 UTC" firstStartedPulling="2026-04-20 10:05:46.927738962 +0000 UTC m=+260.948585167" lastFinishedPulling="2026-04-20 10:05:50.669324181 +0000 UTC m=+264.690170387" observedRunningTime="2026-04-20 10:05:51.530888611 +0000 UTC m=+265.551734836" watchObservedRunningTime="2026-04-20 10:05:55.78851413 +0000 UTC m=+269.809360354" Apr 20 10:05:55.859206 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859184 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-console-config\") pod \"b8593538-b149-49a4-adf7-188d98db185b\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " Apr 20 10:05:55.859352 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859222 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-oauth-serving-cert\") pod \"b8593538-b149-49a4-adf7-188d98db185b\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " Apr 20 10:05:55.859352 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859256 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-service-ca\") pod \"b8593538-b149-49a4-adf7-188d98db185b\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " Apr 20 10:05:55.859474 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859444 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krbhx\" (UniqueName: \"kubernetes.io/projected/b8593538-b149-49a4-adf7-188d98db185b-kube-api-access-krbhx\") pod \"b8593538-b149-49a4-adf7-188d98db185b\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " Apr 20 10:05:55.859530 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859518 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-serving-cert\") pod \"b8593538-b149-49a4-adf7-188d98db185b\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " Apr 20 10:05:55.859585 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859548 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-oauth-config\") pod \"b8593538-b149-49a4-adf7-188d98db185b\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " Apr 20 10:05:55.859638 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859597 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-trusted-ca-bundle\") pod 
\"b8593538-b149-49a4-adf7-188d98db185b\" (UID: \"b8593538-b149-49a4-adf7-188d98db185b\") " Apr 20 10:05:55.859638 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859604 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-console-config" (OuterVolumeSpecName: "console-config") pod "b8593538-b149-49a4-adf7-188d98db185b" (UID: "b8593538-b149-49a4-adf7-188d98db185b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:05:55.859726 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859633 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-service-ca" (OuterVolumeSpecName: "service-ca") pod "b8593538-b149-49a4-adf7-188d98db185b" (UID: "b8593538-b149-49a4-adf7-188d98db185b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:05:55.859726 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859651 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b8593538-b149-49a4-adf7-188d98db185b" (UID: "b8593538-b149-49a4-adf7-188d98db185b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:05:55.859998 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859975 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b8593538-b149-49a4-adf7-188d98db185b" (UID: "b8593538-b149-49a4-adf7-188d98db185b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:05:55.859998 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.859993 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-oauth-serving-cert\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:05:55.860123 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.860011 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-service-ca\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:05:55.860123 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.860023 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-console-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:05:55.861685 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.861657 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b8593538-b149-49a4-adf7-188d98db185b" (UID: "b8593538-b149-49a4-adf7-188d98db185b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:05:55.861780 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.861661 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8593538-b149-49a4-adf7-188d98db185b-kube-api-access-krbhx" (OuterVolumeSpecName: "kube-api-access-krbhx") pod "b8593538-b149-49a4-adf7-188d98db185b" (UID: "b8593538-b149-49a4-adf7-188d98db185b"). InnerVolumeSpecName "kube-api-access-krbhx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:05:55.861780 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.861744 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b8593538-b149-49a4-adf7-188d98db185b" (UID: "b8593538-b149-49a4-adf7-188d98db185b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:05:55.961161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.961128 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-krbhx\" (UniqueName: \"kubernetes.io/projected/b8593538-b149-49a4-adf7-188d98db185b-kube-api-access-krbhx\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:05:55.961161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.961160 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-serving-cert\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:05:55.961161 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.961171 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b8593538-b149-49a4-adf7-188d98db185b-console-oauth-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:05:55.961348 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:55.961180 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8593538-b149-49a4-adf7-188d98db185b-trusted-ca-bundle\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:05:56.528529 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.528503 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-57694c44f6-4bm8l_b8593538-b149-49a4-adf7-188d98db185b/console/0.log" Apr 20 10:05:56.528657 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.528538 2577 generic.go:358] "Generic (PLEG): container finished" podID="b8593538-b149-49a4-adf7-188d98db185b" containerID="cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c" exitCode=2 Apr 20 10:05:56.528657 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.528616 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57694c44f6-4bm8l" Apr 20 10:05:56.528657 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.528621 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57694c44f6-4bm8l" event={"ID":"b8593538-b149-49a4-adf7-188d98db185b","Type":"ContainerDied","Data":"cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c"} Apr 20 10:05:56.528810 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.528662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57694c44f6-4bm8l" event={"ID":"b8593538-b149-49a4-adf7-188d98db185b","Type":"ContainerDied","Data":"8dc64715c10b04dec25d940e5f1e18bdae52932179d3dbae641250d0df3e333e"} Apr 20 10:05:56.528810 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.528682 2577 scope.go:117] "RemoveContainer" containerID="cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c" Apr 20 10:05:56.538616 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.538374 2577 scope.go:117] "RemoveContainer" containerID="cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c" Apr 20 10:05:56.538904 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:05:56.538861 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c\": container with ID starting with 
cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c not found: ID does not exist" containerID="cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c" Apr 20 10:05:56.538954 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.538892 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c"} err="failed to get container status \"cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c\": rpc error: code = NotFound desc = could not find container \"cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c\": container with ID starting with cf5671328de4b142952bfd4923eef492ef44d700fdf7f441f8dd851c392d8f3c not found: ID does not exist" Apr 20 10:05:56.553108 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.553084 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57694c44f6-4bm8l"] Apr 20 10:05:56.555815 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:56.555796 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57694c44f6-4bm8l"] Apr 20 10:05:58.538100 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:05:58.538061 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8593538-b149-49a4-adf7-188d98db185b" path="/var/lib/kubelet/pods/b8593538-b149-49a4-adf7-188d98db185b/volumes" Apr 20 10:06:26.326615 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.326572 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv"] Apr 20 10:06:26.327130 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.326928 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8593538-b149-49a4-adf7-188d98db185b" containerName="console" Apr 20 10:06:26.327130 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.326940 2577 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b8593538-b149-49a4-adf7-188d98db185b" containerName="console" Apr 20 10:06:26.327130 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.327017 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8593538-b149-49a4-adf7-188d98db185b" containerName="console" Apr 20 10:06:26.331049 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.331018 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.333568 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.333537 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hv62k\"" Apr 20 10:06:26.333692 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.333568 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 10:06:26.333940 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.333923 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 10:06:26.342090 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.342059 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv"] Apr 20 10:06:26.374712 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.374683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.374814 ip-10-0-138-148 kubenswrapper[2577]: I0420 
10:06:26.374725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.374814 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.374758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7txl\" (UniqueName: \"kubernetes.io/projected/e567f54b-e261-4528-927f-4efa2fa5c096-kube-api-access-f7txl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.431955 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.431932 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:06:26.435183 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.435164 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:06:26.436517 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.436500 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 10:06:26.475187 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.475081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.475187 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.475138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7txl\" (UniqueName: \"kubernetes.io/projected/e567f54b-e261-4528-927f-4efa2fa5c096-kube-api-access-f7txl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.484328 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.475207 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.484328 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.475497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.484328 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.476048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" 
Apr 20 10:06:26.485593 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.485578 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 10:06:26.495999 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.495985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 10:06:26.506429 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.506409 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7txl\" (UniqueName: \"kubernetes.io/projected/e567f54b-e261-4528-927f-4efa2fa5c096-kube-api-access-f7txl\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.643663 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.643610 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hv62k\"" Apr 20 10:06:26.652407 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.652373 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:26.774535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:26.774508 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv"] Apr 20 10:06:26.778024 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:06:26.777998 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode567f54b_e261_4528_927f_4efa2fa5c096.slice/crio-bc7e01daf067655547c80bb20ccde4c3668b046e1e9520e3511d8a166c8ffa16 WatchSource:0}: Error finding container bc7e01daf067655547c80bb20ccde4c3668b046e1e9520e3511d8a166c8ffa16: Status 404 returned error can't find the container with id bc7e01daf067655547c80bb20ccde4c3668b046e1e9520e3511d8a166c8ffa16 Apr 20 10:06:27.620922 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:27.620887 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" event={"ID":"e567f54b-e261-4528-927f-4efa2fa5c096","Type":"ContainerStarted","Data":"bc7e01daf067655547c80bb20ccde4c3668b046e1e9520e3511d8a166c8ffa16"} Apr 20 10:06:32.637045 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:32.637011 2577 generic.go:358] "Generic (PLEG): container finished" podID="e567f54b-e261-4528-927f-4efa2fa5c096" containerID="d1fb555529985f7a9ec9d0a1cb6832690751f318b602d88e1eddd982920f21d8" exitCode=0 Apr 20 10:06:32.637388 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:32.637051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" event={"ID":"e567f54b-e261-4528-927f-4efa2fa5c096","Type":"ContainerDied","Data":"d1fb555529985f7a9ec9d0a1cb6832690751f318b602d88e1eddd982920f21d8"} Apr 20 10:06:32.637957 ip-10-0-138-148 kubenswrapper[2577]: 
I0420 10:06:32.637941 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 10:06:35.649074 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:35.649042 2577 generic.go:358] "Generic (PLEG): container finished" podID="e567f54b-e261-4528-927f-4efa2fa5c096" containerID="7efc44848dce44ab0ce534b6a5e9029c4261a0f98a9fdfe712c298c6ce4e3be5" exitCode=0 Apr 20 10:06:35.649431 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:35.649128 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" event={"ID":"e567f54b-e261-4528-927f-4efa2fa5c096","Type":"ContainerDied","Data":"7efc44848dce44ab0ce534b6a5e9029c4261a0f98a9fdfe712c298c6ce4e3be5"} Apr 20 10:06:43.677314 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:43.677279 2577 generic.go:358] "Generic (PLEG): container finished" podID="e567f54b-e261-4528-927f-4efa2fa5c096" containerID="3be03a8f0df56fd21ea07e91023e4eb7e982fb22b39be1671040355a540aa51c" exitCode=0 Apr 20 10:06:43.677650 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:43.677344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" event={"ID":"e567f54b-e261-4528-927f-4efa2fa5c096","Type":"ContainerDied","Data":"3be03a8f0df56fd21ea07e91023e4eb7e982fb22b39be1671040355a540aa51c"} Apr 20 10:06:44.806335 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:44.806300 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:44.930549 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:44.930495 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-util\") pod \"e567f54b-e261-4528-927f-4efa2fa5c096\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " Apr 20 10:06:44.930549 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:44.930537 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-bundle\") pod \"e567f54b-e261-4528-927f-4efa2fa5c096\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " Apr 20 10:06:44.930705 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:44.930578 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7txl\" (UniqueName: \"kubernetes.io/projected/e567f54b-e261-4528-927f-4efa2fa5c096-kube-api-access-f7txl\") pod \"e567f54b-e261-4528-927f-4efa2fa5c096\" (UID: \"e567f54b-e261-4528-927f-4efa2fa5c096\") " Apr 20 10:06:44.931152 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:44.931125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-bundle" (OuterVolumeSpecName: "bundle") pod "e567f54b-e261-4528-927f-4efa2fa5c096" (UID: "e567f54b-e261-4528-927f-4efa2fa5c096"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 10:06:44.932751 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:44.932723 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e567f54b-e261-4528-927f-4efa2fa5c096-kube-api-access-f7txl" (OuterVolumeSpecName: "kube-api-access-f7txl") pod "e567f54b-e261-4528-927f-4efa2fa5c096" (UID: "e567f54b-e261-4528-927f-4efa2fa5c096"). InnerVolumeSpecName "kube-api-access-f7txl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:06:44.935661 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:44.935641 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-util" (OuterVolumeSpecName: "util") pod "e567f54b-e261-4528-927f-4efa2fa5c096" (UID: "e567f54b-e261-4528-927f-4efa2fa5c096"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 10:06:45.031356 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:45.031335 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-util\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:06:45.031356 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:45.031355 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e567f54b-e261-4528-927f-4efa2fa5c096-bundle\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:06:45.031485 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:45.031364 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7txl\" (UniqueName: \"kubernetes.io/projected/e567f54b-e261-4528-927f-4efa2fa5c096-kube-api-access-f7txl\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:06:45.685316 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:45.685273 2577 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" Apr 20 10:06:45.685462 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:45.685327 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5t22cv" event={"ID":"e567f54b-e261-4528-927f-4efa2fa5c096","Type":"ContainerDied","Data":"bc7e01daf067655547c80bb20ccde4c3668b046e1e9520e3511d8a166c8ffa16"} Apr 20 10:06:45.685462 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:45.685356 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc7e01daf067655547c80bb20ccde4c3668b046e1e9520e3511d8a166c8ffa16" Apr 20 10:06:48.992851 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.992816 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s"] Apr 20 10:06:48.993269 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.993159 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e567f54b-e261-4528-927f-4efa2fa5c096" containerName="extract" Apr 20 10:06:48.993269 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.993170 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e567f54b-e261-4528-927f-4efa2fa5c096" containerName="extract" Apr 20 10:06:48.993269 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.993221 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e567f54b-e261-4528-927f-4efa2fa5c096" containerName="pull" Apr 20 10:06:48.993269 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.993227 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e567f54b-e261-4528-927f-4efa2fa5c096" containerName="pull" Apr 20 10:06:48.993269 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.993238 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e567f54b-e261-4528-927f-4efa2fa5c096" containerName="util" Apr 20 10:06:48.993269 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.993244 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e567f54b-e261-4528-927f-4efa2fa5c096" containerName="util" Apr 20 10:06:48.993484 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.993298 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e567f54b-e261-4528-927f-4efa2fa5c096" containerName="extract" Apr 20 10:06:48.998157 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:48.998138 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" Apr 20 10:06:49.001104 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.001077 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 10:06:49.001104 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.001094 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-p8l5b\"" Apr 20 10:06:49.001423 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.001117 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 10:06:49.008861 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.008838 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s"] Apr 20 10:06:49.057934 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.057912 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce66345c-44ab-4627-b21f-3b9c625ed435-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-qhv2s\" (UID: 
\"ce66345c-44ab-4627-b21f-3b9c625ed435\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" Apr 20 10:06:49.058038 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.057960 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbs5f\" (UniqueName: \"kubernetes.io/projected/ce66345c-44ab-4627-b21f-3b9c625ed435-kube-api-access-hbs5f\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-qhv2s\" (UID: \"ce66345c-44ab-4627-b21f-3b9c625ed435\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" Apr 20 10:06:49.159264 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.159236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbs5f\" (UniqueName: \"kubernetes.io/projected/ce66345c-44ab-4627-b21f-3b9c625ed435-kube-api-access-hbs5f\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-qhv2s\" (UID: \"ce66345c-44ab-4627-b21f-3b9c625ed435\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" Apr 20 10:06:49.159384 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.159295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce66345c-44ab-4627-b21f-3b9c625ed435-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-qhv2s\" (UID: \"ce66345c-44ab-4627-b21f-3b9c625ed435\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" Apr 20 10:06:49.159629 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.159612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce66345c-44ab-4627-b21f-3b9c625ed435-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-qhv2s\" (UID: \"ce66345c-44ab-4627-b21f-3b9c625ed435\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" Apr 20 10:06:49.168232 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.168207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbs5f\" (UniqueName: \"kubernetes.io/projected/ce66345c-44ab-4627-b21f-3b9c625ed435-kube-api-access-hbs5f\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-qhv2s\" (UID: \"ce66345c-44ab-4627-b21f-3b9c625ed435\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" Apr 20 10:06:49.309441 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.309408 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" Apr 20 10:06:49.649317 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.649284 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s"] Apr 20 10:06:49.651163 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:06:49.651135 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce66345c_44ab_4627_b21f_3b9c625ed435.slice/crio-cd5bef09af3a734e57fc16ee5b92208cc2119a700fb62902b25157596d5946ba WatchSource:0}: Error finding container cd5bef09af3a734e57fc16ee5b92208cc2119a700fb62902b25157596d5946ba: Status 404 returned error can't find the container with id cd5bef09af3a734e57fc16ee5b92208cc2119a700fb62902b25157596d5946ba Apr 20 10:06:49.697978 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:49.697947 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" event={"ID":"ce66345c-44ab-4627-b21f-3b9c625ed435","Type":"ContainerStarted","Data":"cd5bef09af3a734e57fc16ee5b92208cc2119a700fb62902b25157596d5946ba"} Apr 20 10:06:51.705890 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:51.705812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" event={"ID":"ce66345c-44ab-4627-b21f-3b9c625ed435","Type":"ContainerStarted","Data":"f06b43fe252ab6a24e6ce901b1b4f795822702d8e9d6faea0941f3c2ffb8d406"} Apr 20 10:06:51.727773 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:51.727728 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-qhv2s" podStartSLOduration=1.931322217 podStartE2EDuration="3.727714975s" podCreationTimestamp="2026-04-20 10:06:48 +0000 UTC" firstStartedPulling="2026-04-20 10:06:49.653539651 +0000 UTC m=+323.674385858" lastFinishedPulling="2026-04-20 10:06:51.449932402 +0000 UTC m=+325.470778616" observedRunningTime="2026-04-20 10:06:51.726150499 +0000 UTC m=+325.746996723" watchObservedRunningTime="2026-04-20 10:06:51.727714975 +0000 UTC m=+325.748561199" Apr 20 10:06:55.123805 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.123758 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-crq9t"] Apr 20 10:06:55.127232 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.127205 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:55.129237 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.129212 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 10:06:55.129361 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.129347 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 10:06:55.129744 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.129725 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-k6vkl\"" Apr 20 10:06:55.135660 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.135636 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-crq9t"] Apr 20 10:06:55.208361 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.208340 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4gpj\" (UniqueName: \"kubernetes.io/projected/b19f5a81-6d00-43a0-b0fb-95acca919f14-kube-api-access-q4gpj\") pod \"cert-manager-webhook-597b96b99b-crq9t\" (UID: \"b19f5a81-6d00-43a0-b0fb-95acca919f14\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:55.208460 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.208382 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b19f5a81-6d00-43a0-b0fb-95acca919f14-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-crq9t\" (UID: \"b19f5a81-6d00-43a0-b0fb-95acca919f14\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:55.309043 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.309020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q4gpj\" (UniqueName: \"kubernetes.io/projected/b19f5a81-6d00-43a0-b0fb-95acca919f14-kube-api-access-q4gpj\") pod \"cert-manager-webhook-597b96b99b-crq9t\" (UID: \"b19f5a81-6d00-43a0-b0fb-95acca919f14\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:55.309163 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.309060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b19f5a81-6d00-43a0-b0fb-95acca919f14-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-crq9t\" (UID: \"b19f5a81-6d00-43a0-b0fb-95acca919f14\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:55.318585 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.318563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b19f5a81-6d00-43a0-b0fb-95acca919f14-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-crq9t\" (UID: \"b19f5a81-6d00-43a0-b0fb-95acca919f14\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:55.318765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.318740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4gpj\" (UniqueName: \"kubernetes.io/projected/b19f5a81-6d00-43a0-b0fb-95acca919f14-kube-api-access-q4gpj\") pod \"cert-manager-webhook-597b96b99b-crq9t\" (UID: \"b19f5a81-6d00-43a0-b0fb-95acca919f14\") " pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:55.455387 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.455331 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:55.580517 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.580490 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-crq9t"] Apr 20 10:06:55.582986 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:06:55.582957 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb19f5a81_6d00_43a0_b0fb_95acca919f14.slice/crio-87831bee446b885ee152c17650305330fc35923e8bbf5f627a1c9ada06efd155 WatchSource:0}: Error finding container 87831bee446b885ee152c17650305330fc35923e8bbf5f627a1c9ada06efd155: Status 404 returned error can't find the container with id 87831bee446b885ee152c17650305330fc35923e8bbf5f627a1c9ada06efd155 Apr 20 10:06:55.726951 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:55.726886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" event={"ID":"b19f5a81-6d00-43a0-b0fb-95acca919f14","Type":"ContainerStarted","Data":"87831bee446b885ee152c17650305330fc35923e8bbf5f627a1c9ada06efd155"} Apr 20 10:06:58.740475 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:58.740437 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" event={"ID":"b19f5a81-6d00-43a0-b0fb-95acca919f14","Type":"ContainerStarted","Data":"a85bafde3099c8ed92aeecd0b338eb2617f4a6e35bb60bc1d85a79424759f9e0"} Apr 20 10:06:58.740895 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:58.740506 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:06:58.759664 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:06:58.759613 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" podStartSLOduration=1.129688341 
podStartE2EDuration="3.759598703s" podCreationTimestamp="2026-04-20 10:06:55 +0000 UTC" firstStartedPulling="2026-04-20 10:06:55.584884422 +0000 UTC m=+329.605730642" lastFinishedPulling="2026-04-20 10:06:58.214794791 +0000 UTC m=+332.235641004" observedRunningTime="2026-04-20 10:06:58.757069238 +0000 UTC m=+332.777915464" watchObservedRunningTime="2026-04-20 10:06:58.759598703 +0000 UTC m=+332.780444926" Apr 20 10:07:04.745902 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:04.745833 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-crq9t" Apr 20 10:07:05.554712 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.554679 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-54xx2"] Apr 20 10:07:05.558120 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.558099 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-54xx2" Apr 20 10:07:05.560219 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.560198 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-cxqtj\"" Apr 20 10:07:05.568142 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.568119 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-54xx2"] Apr 20 10:07:05.682860 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.682839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprzw\" (UniqueName: \"kubernetes.io/projected/d90c0e7c-b455-47f4-8246-78492a8e0c8b-kube-api-access-qprzw\") pod \"cert-manager-759f64656b-54xx2\" (UID: \"d90c0e7c-b455-47f4-8246-78492a8e0c8b\") " pod="cert-manager/cert-manager-759f64656b-54xx2" Apr 20 10:07:05.682956 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.682888 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90c0e7c-b455-47f4-8246-78492a8e0c8b-bound-sa-token\") pod \"cert-manager-759f64656b-54xx2\" (UID: \"d90c0e7c-b455-47f4-8246-78492a8e0c8b\") " pod="cert-manager/cert-manager-759f64656b-54xx2" Apr 20 10:07:05.783563 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.783537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qprzw\" (UniqueName: \"kubernetes.io/projected/d90c0e7c-b455-47f4-8246-78492a8e0c8b-kube-api-access-qprzw\") pod \"cert-manager-759f64656b-54xx2\" (UID: \"d90c0e7c-b455-47f4-8246-78492a8e0c8b\") " pod="cert-manager/cert-manager-759f64656b-54xx2" Apr 20 10:07:05.783836 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.783584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90c0e7c-b455-47f4-8246-78492a8e0c8b-bound-sa-token\") pod \"cert-manager-759f64656b-54xx2\" (UID: \"d90c0e7c-b455-47f4-8246-78492a8e0c8b\") " pod="cert-manager/cert-manager-759f64656b-54xx2" Apr 20 10:07:05.792054 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.792026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d90c0e7c-b455-47f4-8246-78492a8e0c8b-bound-sa-token\") pod \"cert-manager-759f64656b-54xx2\" (UID: \"d90c0e7c-b455-47f4-8246-78492a8e0c8b\") " pod="cert-manager/cert-manager-759f64656b-54xx2" Apr 20 10:07:05.792164 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.792097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprzw\" (UniqueName: \"kubernetes.io/projected/d90c0e7c-b455-47f4-8246-78492a8e0c8b-kube-api-access-qprzw\") pod \"cert-manager-759f64656b-54xx2\" (UID: \"d90c0e7c-b455-47f4-8246-78492a8e0c8b\") " pod="cert-manager/cert-manager-759f64656b-54xx2" Apr 20 
10:07:05.867723 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.867673 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-54xx2" Apr 20 10:07:05.986318 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:05.986275 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-54xx2"] Apr 20 10:07:05.988528 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:07:05.988497 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd90c0e7c_b455_47f4_8246_78492a8e0c8b.slice/crio-c9356f91c32e0f4fc522914253df582ca777a137b44d8632e21cfb04ee8be7f7 WatchSource:0}: Error finding container c9356f91c32e0f4fc522914253df582ca777a137b44d8632e21cfb04ee8be7f7: Status 404 returned error can't find the container with id c9356f91c32e0f4fc522914253df582ca777a137b44d8632e21cfb04ee8be7f7 Apr 20 10:07:06.775944 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:06.775907 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-54xx2" event={"ID":"d90c0e7c-b455-47f4-8246-78492a8e0c8b","Type":"ContainerStarted","Data":"b929694f0f6969b40cec77dd1b35bb4669e37fcf234833709a24ecf904997e1a"} Apr 20 10:07:06.775944 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:06.775940 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-54xx2" event={"ID":"d90c0e7c-b455-47f4-8246-78492a8e0c8b","Type":"ContainerStarted","Data":"c9356f91c32e0f4fc522914253df582ca777a137b44d8632e21cfb04ee8be7f7"} Apr 20 10:07:06.793461 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:06.793412 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-54xx2" podStartSLOduration=1.793398395 podStartE2EDuration="1.793398395s" podCreationTimestamp="2026-04-20 10:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:07:06.791994606 +0000 UTC m=+340.812840831" watchObservedRunningTime="2026-04-20 10:07:06.793398395 +0000 UTC m=+340.814244620" Apr 20 10:07:07.153913 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.153846 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx"] Apr 20 10:07:07.157491 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.157474 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.159729 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.159708 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 10:07:07.159729 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.159721 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 10:07:07.159864 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.159779 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-hv62k\"" Apr 20 10:07:07.165591 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.165570 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx"] Apr 20 10:07:07.296849 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.296828 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.296933 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.296863 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9jp\" (UniqueName: \"kubernetes.io/projected/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-kube-api-access-vf9jp\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.296988 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.296950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.397334 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.397293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.397430 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.397375 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.397430 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.397401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf9jp\" (UniqueName: \"kubernetes.io/projected/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-kube-api-access-vf9jp\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.397708 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.397689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.397746 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.397715 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.406135 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.406078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf9jp\" (UniqueName: \"kubernetes.io/projected/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-kube-api-access-vf9jp\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.468031 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.468010 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:07.588521 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.588494 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx"] Apr 20 10:07:07.590326 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:07:07.590275 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671b8a7a_4ba2_4864_ad1d_078e3ecc06cb.slice/crio-0e162f875e2b372e1b3d4e7d60ce14520a109ee22f986a31926dcf1b2a8263b9 WatchSource:0}: Error finding container 0e162f875e2b372e1b3d4e7d60ce14520a109ee22f986a31926dcf1b2a8263b9: Status 404 returned error can't find the container with id 0e162f875e2b372e1b3d4e7d60ce14520a109ee22f986a31926dcf1b2a8263b9 Apr 20 10:07:07.780868 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.780836 2577 generic.go:358] "Generic (PLEG): container finished" podID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerID="bdd1cd08629d1ae4d6c8724537a456af13db82a06db456358e49507deffb3a66" exitCode=0 Apr 20 10:07:07.780996 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.780885 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" event={"ID":"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb","Type":"ContainerDied","Data":"bdd1cd08629d1ae4d6c8724537a456af13db82a06db456358e49507deffb3a66"} Apr 20 10:07:07.780996 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:07.780919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" event={"ID":"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb","Type":"ContainerStarted","Data":"0e162f875e2b372e1b3d4e7d60ce14520a109ee22f986a31926dcf1b2a8263b9"} Apr 20 10:07:09.789530 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:09.789505 2577 generic.go:358] "Generic (PLEG): container finished" podID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerID="8ef34dc8d4c451b96837fbec59688faae45f2efab2b813d78d602172d4bbf40c" exitCode=0 Apr 20 10:07:09.789810 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:09.789541 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" event={"ID":"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb","Type":"ContainerDied","Data":"8ef34dc8d4c451b96837fbec59688faae45f2efab2b813d78d602172d4bbf40c"} Apr 20 10:07:10.794591 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:10.794556 2577 generic.go:358] "Generic (PLEG): container finished" podID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerID="ed97b886a1526d1a83893d26820c959d7fe6d894bebab0c3bf4b1bc78198c74d" exitCode=0 Apr 20 10:07:10.794919 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:10.794622 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" event={"ID":"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb","Type":"ContainerDied","Data":"ed97b886a1526d1a83893d26820c959d7fe6d894bebab0c3bf4b1bc78198c74d"} Apr 20 10:07:11.920671 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:11.920648 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:12.038275 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.038251 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-util\") pod \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " Apr 20 10:07:12.038394 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.038279 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf9jp\" (UniqueName: \"kubernetes.io/projected/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-kube-api-access-vf9jp\") pod \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " Apr 20 10:07:12.038394 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.038341 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-bundle\") pod \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\" (UID: \"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb\") " Apr 20 10:07:12.038681 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.038654 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-bundle" (OuterVolumeSpecName: "bundle") pod "671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" (UID: "671b8a7a-4ba2-4864-ad1d-078e3ecc06cb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 10:07:12.040246 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.040214 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-kube-api-access-vf9jp" (OuterVolumeSpecName: "kube-api-access-vf9jp") pod "671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" (UID: "671b8a7a-4ba2-4864-ad1d-078e3ecc06cb"). InnerVolumeSpecName "kube-api-access-vf9jp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:07:12.043502 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.043461 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-util" (OuterVolumeSpecName: "util") pod "671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" (UID: "671b8a7a-4ba2-4864-ad1d-078e3ecc06cb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 10:07:12.138954 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.138906 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-util\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:07:12.138954 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.138927 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vf9jp\" (UniqueName: \"kubernetes.io/projected/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-kube-api-access-vf9jp\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:07:12.138954 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.138938 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/671b8a7a-4ba2-4864-ad1d-078e3ecc06cb-bundle\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:07:12.803188 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.803139 2577 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" Apr 20 10:07:12.803292 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.803137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78eq9xbx" event={"ID":"671b8a7a-4ba2-4864-ad1d-078e3ecc06cb","Type":"ContainerDied","Data":"0e162f875e2b372e1b3d4e7d60ce14520a109ee22f986a31926dcf1b2a8263b9"} Apr 20 10:07:12.803292 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:12.803257 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e162f875e2b372e1b3d4e7d60ce14520a109ee22f986a31926dcf1b2a8263b9" Apr 20 10:07:39.285727 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.285692 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx"] Apr 20 10:07:39.286195 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.286096 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerName="util" Apr 20 10:07:39.286195 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.286109 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerName="util" Apr 20 10:07:39.286195 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.286126 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerName="extract" Apr 20 10:07:39.286195 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.286132 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerName="extract" Apr 20 10:07:39.286195 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.286141 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerName="pull" Apr 20 10:07:39.286195 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.286145 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerName="pull" Apr 20 10:07:39.286426 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.286217 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="671b8a7a-4ba2-4864-ad1d-078e3ecc06cb" containerName="extract" Apr 20 10:07:39.295355 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.295331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.298240 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.298185 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 20 10:07:39.298387 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.298297 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 20 10:07:39.298387 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.298362 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 20 10:07:39.298514 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.298435 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 20 10:07:39.298514 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.298459 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 20 10:07:39.298711 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.298691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-6v7px\"" Apr 20 10:07:39.300837 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.300817 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx"] Apr 20 10:07:39.343144 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.343121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmf6r\" (UniqueName: \"kubernetes.io/projected/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-kube-api-access-rmf6r\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.343252 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.343151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-cert\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.343252 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.343173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-metrics-certs\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.343349 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.343275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-manager-config\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.444487 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.444462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmf6r\" (UniqueName: \"kubernetes.io/projected/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-kube-api-access-rmf6r\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.444581 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.444491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-cert\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.444581 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.444516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-metrics-certs\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.444581 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.444567 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-manager-config\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " 
pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.445153 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.445133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-manager-config\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.446959 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.446936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-metrics-certs\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.447035 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.446983 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-cert\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.462738 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.462716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmf6r\" (UniqueName: \"kubernetes.io/projected/05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae-kube-api-access-rmf6r\") pod \"jobset-controller-manager-55bf455989-ts4gx\" (UID: \"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae\") " pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.605885 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.605831 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:39.747677 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.747650 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx"] Apr 20 10:07:39.750611 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:07:39.750586 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05fdcb4f_8174_49a8_8dc2_f59a0fdb22ae.slice/crio-2139f56b3d00b58ca01eb6c5a6cbf8b6b17a8c04e872dfe2f74e4da8dcc0e68b WatchSource:0}: Error finding container 2139f56b3d00b58ca01eb6c5a6cbf8b6b17a8c04e872dfe2f74e4da8dcc0e68b: Status 404 returned error can't find the container with id 2139f56b3d00b58ca01eb6c5a6cbf8b6b17a8c04e872dfe2f74e4da8dcc0e68b Apr 20 10:07:39.898765 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:39.898706 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" event={"ID":"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae","Type":"ContainerStarted","Data":"2139f56b3d00b58ca01eb6c5a6cbf8b6b17a8c04e872dfe2f74e4da8dcc0e68b"} Apr 20 10:07:42.913248 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:42.913213 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" event={"ID":"05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae","Type":"ContainerStarted","Data":"2544724edd27477cb1293bcec8ce6f9f56487b2838e20611b8b76c265071096f"} Apr 20 10:07:42.913607 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:42.913296 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:07:42.931793 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:42.931752 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" podStartSLOduration=1.546615791 podStartE2EDuration="3.9317391s" podCreationTimestamp="2026-04-20 10:07:39 +0000 UTC" firstStartedPulling="2026-04-20 10:07:39.752437982 +0000 UTC m=+373.773284184" lastFinishedPulling="2026-04-20 10:07:42.137561291 +0000 UTC m=+376.158407493" observedRunningTime="2026-04-20 10:07:42.929584985 +0000 UTC m=+376.950431232" watchObservedRunningTime="2026-04-20 10:07:42.9317391 +0000 UTC m=+376.952585363" Apr 20 10:07:53.922300 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:07:53.922273 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-55bf455989-ts4gx" Apr 20 10:09:25.036229 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.036193 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f7696b5f8-ggmss"] Apr 20 10:09:25.039841 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.039818 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.055284 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.055261 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f7696b5f8-ggmss"] Apr 20 10:09:25.131636 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.131612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6a99658-d48c-41ab-8845-5c02440214aa-console-oauth-config\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.131725 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.131645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-service-ca\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.131725 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.131666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-trusted-ca-bundle\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.131725 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.131690 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-oauth-serving-cert\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 
10:09:25.131828 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.131757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6a99658-d48c-41ab-8845-5c02440214aa-console-serving-cert\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.131828 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.131806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-console-config\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.131828 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.131824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f26z\" (UniqueName: \"kubernetes.io/projected/a6a99658-d48c-41ab-8845-5c02440214aa-kube-api-access-7f26z\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.232936 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.232912 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-console-config\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233038 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.232939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f26z\" (UniqueName: \"kubernetes.io/projected/a6a99658-d48c-41ab-8845-5c02440214aa-kube-api-access-7f26z\") pod 
\"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233038 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.232972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6a99658-d48c-41ab-8845-5c02440214aa-console-oauth-config\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233038 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.232994 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-service-ca\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233197 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.233110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-trusted-ca-bundle\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233197 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.233145 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-oauth-serving-cert\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233197 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.233187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a6a99658-d48c-41ab-8845-5c02440214aa-console-serving-cert\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233693 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.233671 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-service-ca\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233802 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.233691 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-console-config\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.233965 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.233942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-oauth-serving-cert\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.234128 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.234100 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6a99658-d48c-41ab-8845-5c02440214aa-trusted-ca-bundle\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.235787 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.235764 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6a99658-d48c-41ab-8845-5c02440214aa-console-oauth-config\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.235857 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.235773 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6a99658-d48c-41ab-8845-5c02440214aa-console-serving-cert\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.241251 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.241232 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f26z\" (UniqueName: \"kubernetes.io/projected/a6a99658-d48c-41ab-8845-5c02440214aa-kube-api-access-7f26z\") pod \"console-5f7696b5f8-ggmss\" (UID: \"a6a99658-d48c-41ab-8845-5c02440214aa\") " pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.349867 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.349816 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:25.470482 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:25.470459 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f7696b5f8-ggmss"] Apr 20 10:09:25.473105 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:09:25.473079 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6a99658_d48c_41ab_8845_5c02440214aa.slice/crio-bae49f21dfaf2ec6b343f7435e89701fbedd960817c716aa5c905a8574ebe5c9 WatchSource:0}: Error finding container bae49f21dfaf2ec6b343f7435e89701fbedd960817c716aa5c905a8574ebe5c9: Status 404 returned error can't find the container with id bae49f21dfaf2ec6b343f7435e89701fbedd960817c716aa5c905a8574ebe5c9 Apr 20 10:09:26.285947 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:26.285912 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f7696b5f8-ggmss" event={"ID":"a6a99658-d48c-41ab-8845-5c02440214aa","Type":"ContainerStarted","Data":"8b7bc6b35377a53579338000b13a45aada4d1ef35251a1f5d0ad19132b8a8201"} Apr 20 10:09:26.285947 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:26.285952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f7696b5f8-ggmss" event={"ID":"a6a99658-d48c-41ab-8845-5c02440214aa","Type":"ContainerStarted","Data":"bae49f21dfaf2ec6b343f7435e89701fbedd960817c716aa5c905a8574ebe5c9"} Apr 20 10:09:26.307844 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:26.307780 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f7696b5f8-ggmss" podStartSLOduration=1.307765251 podStartE2EDuration="1.307765251s" podCreationTimestamp="2026-04-20 10:09:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:09:26.305807303 +0000 UTC 
m=+480.326653527" watchObservedRunningTime="2026-04-20 10:09:26.307765251 +0000 UTC m=+480.328611475" Apr 20 10:09:35.350373 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:35.350336 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:35.350373 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:35.350377 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:35.355284 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:35.355257 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:36.328533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:36.328507 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f7696b5f8-ggmss" Apr 20 10:09:36.383074 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:09:36.383044 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74d95f456c-v4fcc"] Apr 20 10:10:01.402268 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.402206 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74d95f456c-v4fcc" podUID="84146eb7-c871-4b19-be2a-5b9184c35fe5" containerName="console" containerID="cri-o://88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac" gracePeriod=15 Apr 20 10:10:01.639231 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.639211 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d95f456c-v4fcc_84146eb7-c871-4b19-be2a-5b9184c35fe5/console/0.log" Apr 20 10:10:01.639343 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.639278 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:10:01.704225 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704159 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-service-ca\") pod \"84146eb7-c871-4b19-be2a-5b9184c35fe5\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " Apr 20 10:10:01.704225 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704200 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-serving-cert\") pod \"84146eb7-c871-4b19-be2a-5b9184c35fe5\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " Apr 20 10:10:01.704447 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704238 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-trusted-ca-bundle\") pod \"84146eb7-c871-4b19-be2a-5b9184c35fe5\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " Apr 20 10:10:01.704447 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704268 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-oauth-config\") pod \"84146eb7-c871-4b19-be2a-5b9184c35fe5\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " Apr 20 10:10:01.704447 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704368 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66xg9\" (UniqueName: \"kubernetes.io/projected/84146eb7-c871-4b19-be2a-5b9184c35fe5-kube-api-access-66xg9\") pod \"84146eb7-c871-4b19-be2a-5b9184c35fe5\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " Apr 20 10:10:01.704447 
ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704403 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-oauth-serving-cert\") pod \"84146eb7-c871-4b19-be2a-5b9184c35fe5\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " Apr 20 10:10:01.704447 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704431 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-config\") pod \"84146eb7-c871-4b19-be2a-5b9184c35fe5\" (UID: \"84146eb7-c871-4b19-be2a-5b9184c35fe5\") " Apr 20 10:10:01.704696 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704679 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-service-ca" (OuterVolumeSpecName: "service-ca") pod "84146eb7-c871-4b19-be2a-5b9184c35fe5" (UID: "84146eb7-c871-4b19-be2a-5b9184c35fe5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:10:01.704759 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704749 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-service-ca\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:10:01.704906 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704878 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "84146eb7-c871-4b19-be2a-5b9184c35fe5" (UID: "84146eb7-c871-4b19-be2a-5b9184c35fe5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:10:01.704906 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704870 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "84146eb7-c871-4b19-be2a-5b9184c35fe5" (UID: "84146eb7-c871-4b19-be2a-5b9184c35fe5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:10:01.705066 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.704995 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-config" (OuterVolumeSpecName: "console-config") pod "84146eb7-c871-4b19-be2a-5b9184c35fe5" (UID: "84146eb7-c871-4b19-be2a-5b9184c35fe5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 10:10:01.706542 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.706519 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "84146eb7-c871-4b19-be2a-5b9184c35fe5" (UID: "84146eb7-c871-4b19-be2a-5b9184c35fe5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:10:01.706641 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.706618 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84146eb7-c871-4b19-be2a-5b9184c35fe5-kube-api-access-66xg9" (OuterVolumeSpecName: "kube-api-access-66xg9") pod "84146eb7-c871-4b19-be2a-5b9184c35fe5" (UID: "84146eb7-c871-4b19-be2a-5b9184c35fe5"). InnerVolumeSpecName "kube-api-access-66xg9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 10:10:01.706683 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.706632 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "84146eb7-c871-4b19-be2a-5b9184c35fe5" (UID: "84146eb7-c871-4b19-be2a-5b9184c35fe5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 10:10:01.805463 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.805432 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-serving-cert\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:10:01.805463 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.805462 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-trusted-ca-bundle\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:10:01.805463 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.805472 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-oauth-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:10:01.805618 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.805481 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-66xg9\" (UniqueName: \"kubernetes.io/projected/84146eb7-c871-4b19-be2a-5b9184c35fe5-kube-api-access-66xg9\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:10:01.805618 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.805492 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-oauth-serving-cert\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:10:01.805618 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:01.805502 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/84146eb7-c871-4b19-be2a-5b9184c35fe5-console-config\") on node \"ip-10-0-138-148.ec2.internal\" DevicePath \"\"" Apr 20 10:10:02.428454 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.428428 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d95f456c-v4fcc_84146eb7-c871-4b19-be2a-5b9184c35fe5/console/0.log" Apr 20 10:10:02.428804 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.428473 2577 generic.go:358] "Generic (PLEG): container finished" podID="84146eb7-c871-4b19-be2a-5b9184c35fe5" containerID="88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac" exitCode=2 Apr 20 10:10:02.428804 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.428517 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d95f456c-v4fcc" event={"ID":"84146eb7-c871-4b19-be2a-5b9184c35fe5","Type":"ContainerDied","Data":"88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac"} Apr 20 10:10:02.428804 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.428561 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d95f456c-v4fcc" Apr 20 10:10:02.428804 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.428578 2577 scope.go:117] "RemoveContainer" containerID="88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac" Apr 20 10:10:02.428804 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.428564 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d95f456c-v4fcc" event={"ID":"84146eb7-c871-4b19-be2a-5b9184c35fe5","Type":"ContainerDied","Data":"943d5f196acbdf9e6189cbae73e3bb697eac8fdcbb487f1eb66e8946ecd47bd8"} Apr 20 10:10:02.438533 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.438514 2577 scope.go:117] "RemoveContainer" containerID="88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac" Apr 20 10:10:02.438838 ip-10-0-138-148 kubenswrapper[2577]: E0420 10:10:02.438819 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac\": container with ID starting with 88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac not found: ID does not exist" containerID="88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac" Apr 20 10:10:02.438885 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.438851 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac"} err="failed to get container status \"88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac\": rpc error: code = NotFound desc = could not find container \"88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac\": container with ID starting with 88cf147b35cdd0760ecc0aa078f25ee2fc49b328feda6795e8ec1a19399803ac not found: ID does not exist" Apr 20 10:10:02.450829 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.450799 2577 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74d95f456c-v4fcc"] Apr 20 10:10:02.454613 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.454587 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74d95f456c-v4fcc"] Apr 20 10:10:02.539298 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:10:02.539262 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84146eb7-c871-4b19-be2a-5b9184c35fe5" path="/var/lib/kubelet/pods/84146eb7-c871-4b19-be2a-5b9184c35fe5/volumes" Apr 20 10:11:26.460684 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:11:26.460649 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:11:26.461210 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:11:26.461186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:16:26.485983 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:16:26.485898 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:16:26.486861 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:16:26.486845 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:21:26.510988 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:21:26.510958 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:21:26.514941 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:21:26.514924 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:26:26.538410 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:26:26.538290 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:26:26.543689 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:26:26.543669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:31:26.565532 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:31:26.565434 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:31:26.575451 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:31:26.575430 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:36:26.588758 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:36:26.588663 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:36:26.600113 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:36:26.600096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:37:24.225340 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:24.225237 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-sdz58_39b2e908-eabf-4b4f-ae1d-5e46c3d244cf/global-pull-secret-syncer/0.log" Apr 20 10:37:24.343106 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:24.343074 2577 log.go:25] "Finished parsing 
log file" path="/var/log/pods/kube-system_konnectivity-agent-78fm5_a9613cd5-debf-4732-aa68-673d25ca0a6a/konnectivity-agent/0.log" Apr 20 10:37:24.427433 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:24.427399 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-148.ec2.internal_e3678d8120bbc28bfdc9a8f678b7b7df/haproxy/0.log" Apr 20 10:37:27.281143 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.281110 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f75c8fa0-09a4-4c62-aa67-d9762dbea003/alertmanager/0.log" Apr 20 10:37:27.306845 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.306820 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f75c8fa0-09a4-4c62-aa67-d9762dbea003/config-reloader/0.log" Apr 20 10:37:27.331807 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.331785 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f75c8fa0-09a4-4c62-aa67-d9762dbea003/kube-rbac-proxy-web/0.log" Apr 20 10:37:27.354872 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.354855 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f75c8fa0-09a4-4c62-aa67-d9762dbea003/kube-rbac-proxy/0.log" Apr 20 10:37:27.385651 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.385626 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f75c8fa0-09a4-4c62-aa67-d9762dbea003/kube-rbac-proxy-metric/0.log" Apr 20 10:37:27.411526 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.411504 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f75c8fa0-09a4-4c62-aa67-d9762dbea003/prom-label-proxy/0.log" Apr 20 10:37:27.435608 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.435585 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f75c8fa0-09a4-4c62-aa67-d9762dbea003/init-config-reloader/0.log" Apr 20 10:37:27.503768 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.503749 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ffbl4_b0b1003b-e990-4b70-bfb9-08b2cd905f97/kube-state-metrics/0.log" Apr 20 10:37:27.552330 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.552252 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ffbl4_b0b1003b-e990-4b70-bfb9-08b2cd905f97/kube-rbac-proxy-main/0.log" Apr 20 10:37:27.583687 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.583668 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-ffbl4_b0b1003b-e990-4b70-bfb9-08b2cd905f97/kube-rbac-proxy-self/0.log" Apr 20 10:37:27.615668 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.615653 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-668b98d794-gx58x_1c0820ec-79be-4122-9bc9-af1459969f09/metrics-server/0.log" Apr 20 10:37:27.648410 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.648378 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-w8jls_7bb9843a-089e-47f2-91c3-e7ca46156663/monitoring-plugin/0.log" Apr 20 10:37:27.759258 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.759238 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jpgzh_c7ca1e44-73f9-4982-9492-529ac3ad8e18/node-exporter/0.log" Apr 20 10:37:27.785729 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.785709 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jpgzh_c7ca1e44-73f9-4982-9492-529ac3ad8e18/kube-rbac-proxy/0.log" Apr 20 10:37:27.813727 ip-10-0-138-148 
kubenswrapper[2577]: I0420 10:37:27.813680 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jpgzh_c7ca1e44-73f9-4982-9492-529ac3ad8e18/init-textfile/0.log" Apr 20 10:37:27.920418 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.920396 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vq444_67c5326b-25f5-40b4-9fb5-76cb61e11800/kube-rbac-proxy-main/0.log" Apr 20 10:37:27.963803 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.963781 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vq444_67c5326b-25f5-40b4-9fb5-76cb61e11800/kube-rbac-proxy-self/0.log" Apr 20 10:37:27.992121 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:27.992100 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-vq444_67c5326b-25f5-40b4-9fb5-76cb61e11800/openshift-state-metrics/0.log" Apr 20 10:37:28.230844 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.230796 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vt54g_45c27faf-167f-454a-9c28-7d7fa2f034a7/prometheus-operator/0.log" Apr 20 10:37:28.250106 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.250090 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-vt54g_45c27faf-167f-454a-9c28-7d7fa2f034a7/kube-rbac-proxy/0.log" Apr 20 10:37:28.276662 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.276642 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-zdb9c_dab700b5-5990-4d1f-882a-c2eef84a1305/prometheus-operator-admission-webhook/0.log" Apr 20 10:37:28.303750 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.303732 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-687d956bd-rcphn_00471395-7962-401e-82db-0a745547e86a/telemeter-client/0.log" Apr 20 10:37:28.327785 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.327758 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-687d956bd-rcphn_00471395-7962-401e-82db-0a745547e86a/reload/0.log" Apr 20 10:37:28.349820 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.349799 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-687d956bd-rcphn_00471395-7962-401e-82db-0a745547e86a/kube-rbac-proxy/0.log" Apr 20 10:37:28.377287 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.377264 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d5f699f68-zgg4r_db351594-bd1d-4f30-a5a3-d8f3e7f8864b/thanos-query/0.log" Apr 20 10:37:28.402479 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.402462 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d5f699f68-zgg4r_db351594-bd1d-4f30-a5a3-d8f3e7f8864b/kube-rbac-proxy-web/0.log" Apr 20 10:37:28.433575 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.433556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d5f699f68-zgg4r_db351594-bd1d-4f30-a5a3-d8f3e7f8864b/kube-rbac-proxy/0.log" Apr 20 10:37:28.459514 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.459499 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d5f699f68-zgg4r_db351594-bd1d-4f30-a5a3-d8f3e7f8864b/prom-label-proxy/0.log" Apr 20 10:37:28.482782 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:28.482732 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d5f699f68-zgg4r_db351594-bd1d-4f30-a5a3-d8f3e7f8864b/kube-rbac-proxy-rules/0.log" Apr 20 10:37:28.505661 ip-10-0-138-148 
kubenswrapper[2577]: I0420 10:37:28.505647 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-d5f699f68-zgg4r_db351594-bd1d-4f30-a5a3-d8f3e7f8864b/kube-rbac-proxy-metrics/0.log" Apr 20 10:37:30.857499 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.857465 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m"] Apr 20 10:37:30.857867 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.857828 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84146eb7-c871-4b19-be2a-5b9184c35fe5" containerName="console" Apr 20 10:37:30.857867 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.857840 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="84146eb7-c871-4b19-be2a-5b9184c35fe5" containerName="console" Apr 20 10:37:30.857941 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.857903 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="84146eb7-c871-4b19-be2a-5b9184c35fe5" containerName="console" Apr 20 10:37:30.860931 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.860913 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:30.867633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.867477 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kqj2z\"/\"openshift-service-ca.crt\"" Apr 20 10:37:30.867633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.867477 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kqj2z\"/\"kube-root-ca.crt\"" Apr 20 10:37:30.867633 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.867499 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kqj2z\"/\"default-dockercfg-zn9bs\"" Apr 20 10:37:30.884578 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.884557 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m"] Apr 20 10:37:30.963375 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.963354 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f7696b5f8-ggmss_a6a99658-d48c-41ab-8845-5c02440214aa/console/0.log" Apr 20 10:37:30.990760 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.990738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-proc\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:30.990846 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.990785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-podres\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: 
\"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:30.990846 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.990809 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-sys\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:30.990952 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.990853 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctqzj\" (UniqueName: \"kubernetes.io/projected/0c75c758-7d1f-4545-a2b7-cd179503e216-kube-api-access-ctqzj\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:30.990952 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:30.990877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-lib-modules\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.039674 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.039653 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-rfsz8_d6698a90-789d-4e6d-89f3-76b2282e07f7/download-server/0.log" Apr 20 10:37:31.091561 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-podres\") pod 
\"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.091667 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-sys\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.091667 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctqzj\" (UniqueName: \"kubernetes.io/projected/0c75c758-7d1f-4545-a2b7-cd179503e216-kube-api-access-ctqzj\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.091667 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-lib-modules\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.091807 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-proc\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.091807 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-sys\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.091807 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-podres\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.091807 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-proc\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.091807 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.091786 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0c75c758-7d1f-4545-a2b7-cd179503e216-lib-modules\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.104165 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.104143 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctqzj\" (UniqueName: \"kubernetes.io/projected/0c75c758-7d1f-4545-a2b7-cd179503e216-kube-api-access-ctqzj\") pod \"perf-node-gather-daemonset-jbx9m\" (UID: \"0c75c758-7d1f-4545-a2b7-cd179503e216\") " pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.171107 ip-10-0-138-148 kubenswrapper[2577]: 
I0420 10:37:31.171053 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:31.294157 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.294133 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m"] Apr 20 10:37:31.295166 ip-10-0-138-148 kubenswrapper[2577]: W0420 10:37:31.295139 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0c75c758_7d1f_4545_a2b7_cd179503e216.slice/crio-dd865c8e3539168e44dfc6b209933add2ab4fcdfe0aa23430cfcb69d94760203 WatchSource:0}: Error finding container dd865c8e3539168e44dfc6b209933add2ab4fcdfe0aa23430cfcb69d94760203: Status 404 returned error can't find the container with id dd865c8e3539168e44dfc6b209933add2ab4fcdfe0aa23430cfcb69d94760203 Apr 20 10:37:31.296853 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:31.296831 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 10:37:32.093172 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:32.093144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" event={"ID":"0c75c758-7d1f-4545-a2b7-cd179503e216","Type":"ContainerStarted","Data":"c4b08d65aaf53eeae94d9aec71f5868e42db2590b2240b77bbe4e887a8285a19"} Apr 20 10:37:32.093535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:32.093179 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" event={"ID":"0c75c758-7d1f-4545-a2b7-cd179503e216","Type":"ContainerStarted","Data":"dd865c8e3539168e44dfc6b209933add2ab4fcdfe0aa23430cfcb69d94760203"} Apr 20 10:37:32.093535 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:32.093293 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:32.119335 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:32.119276 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" podStartSLOduration=2.119264602 podStartE2EDuration="2.119264602s" podCreationTimestamp="2026-04-20 10:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 10:37:32.11724736 +0000 UTC m=+2166.138093583" watchObservedRunningTime="2026-04-20 10:37:32.119264602 +0000 UTC m=+2166.140110853" Apr 20 10:37:32.455227 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:32.455161 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gngqn_58441859-cdfd-435c-a17f-f77225ae4513/dns/0.log" Apr 20 10:37:32.510479 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:32.510453 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-gngqn_58441859-cdfd-435c-a17f-f77225ae4513/kube-rbac-proxy/0.log" Apr 20 10:37:32.690383 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:32.690361 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qlvb7_081211a7-2d42-49fc-b457-6cded43e3390/dns-node-resolver/0.log" Apr 20 10:37:33.218281 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:33.218253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6b68f8c6f9-rrmkb_c58a395e-b75a-40a0-b474-87562006f6e1/registry/0.log" Apr 20 10:37:33.253485 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:33.253460 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4467h_34621a55-49e0-4ddf-85fb-fe957bb51987/node-ca/0.log" Apr 20 10:37:34.737595 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:34.737568 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-f6m7h_033bb25f-40eb-4a7f-ad48-e552fca86c6d/serve-healthcheck-canary/0.log" Apr 20 10:37:35.478921 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:35.478895 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-msdsc_5775bdcc-774b-426c-b355-50ab682f46eb/kube-rbac-proxy/0.log" Apr 20 10:37:35.500776 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:35.500753 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-msdsc_5775bdcc-774b-426c-b355-50ab682f46eb/exporter/0.log" Apr 20 10:37:35.522033 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:35.522006 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-msdsc_5775bdcc-774b-426c-b355-50ab682f46eb/extractor/0.log" Apr 20 10:37:37.206855 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:37.206813 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-55bf455989-ts4gx_05fdcb4f-8174-49a8-8dc2-f59a0fdb22ae/manager/0.log" Apr 20 10:37:38.105815 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:38.105793 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kqj2z/perf-node-gather-daemonset-jbx9m" Apr 20 10:37:42.374247 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.374222 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j7mgk_f8dc4a94-baed-4ba8-8d13-6d45c52751d3/kube-multus-additional-cni-plugins/0.log" Apr 20 10:37:42.394718 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.394699 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j7mgk_f8dc4a94-baed-4ba8-8d13-6d45c52751d3/egress-router-binary-copy/0.log" Apr 20 10:37:42.414936 ip-10-0-138-148 kubenswrapper[2577]: 
I0420 10:37:42.414913 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j7mgk_f8dc4a94-baed-4ba8-8d13-6d45c52751d3/cni-plugins/0.log" Apr 20 10:37:42.439462 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.439442 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j7mgk_f8dc4a94-baed-4ba8-8d13-6d45c52751d3/bond-cni-plugin/0.log" Apr 20 10:37:42.460856 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.460837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j7mgk_f8dc4a94-baed-4ba8-8d13-6d45c52751d3/routeoverride-cni/0.log" Apr 20 10:37:42.491941 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.491922 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j7mgk_f8dc4a94-baed-4ba8-8d13-6d45c52751d3/whereabouts-cni-bincopy/0.log" Apr 20 10:37:42.516908 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.516888 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j7mgk_f8dc4a94-baed-4ba8-8d13-6d45c52751d3/whereabouts-cni/0.log" Apr 20 10:37:42.633521 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.633471 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vq9gr_93bf46da-c530-40ee-bced-9d0772cc84b7/kube-multus/0.log" Apr 20 10:37:42.652737 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.652717 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4htjt_aa479de0-842b-41a8-952f-4382abbdf250/network-metrics-daemon/0.log" Apr 20 10:37:42.671883 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:42.671867 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4htjt_aa479de0-842b-41a8-952f-4382abbdf250/kube-rbac-proxy/0.log" Apr 20 
10:37:44.195341 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.195292 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-controller/0.log" Apr 20 10:37:44.232048 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.232013 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:37:44.249995 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.249973 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/1.log" Apr 20 10:37:44.294817 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.294792 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/kube-rbac-proxy-node/0.log" Apr 20 10:37:44.321803 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.321781 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 10:37:44.345783 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.345761 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/northd/0.log" Apr 20 10:37:44.371299 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.371260 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/nbdb/0.log" Apr 20 10:37:44.396671 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.396648 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/sbdb/0.log" Apr 20 
10:37:44.550742 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:44.550721 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovnkube-controller/0.log" Apr 20 10:37:45.608057 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:45.608032 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-b258f_340db265-d04c-46d7-b5b0-6141dced7313/network-check-target-container/0.log" Apr 20 10:37:46.519203 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:46.519181 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-lcfcx_13714523-2ce0-41a5-92d0-6d74b6f94cba/iptables-alerter/0.log" Apr 20 10:37:47.286565 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:47.286542 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ldc77_f4ce2de7-ed89-497b-b795-9aa124b05d1c/tuned/0.log" Apr 20 10:37:50.976202 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:50.976175 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-5xc8h_9e0bf27c-dbd8-4449-b3f8-9a2e7c0ac56b/service-ca-controller/0.log" Apr 20 10:37:51.728215 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:51.728186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9ttl2_1f12cab1-38db-4199-ab17-ed83ce13c27d/csi-driver/0.log" Apr 20 10:37:51.789239 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:51.789214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9ttl2_1f12cab1-38db-4199-ab17-ed83ce13c27d/csi-node-driver-registrar/0.log" Apr 20 10:37:51.837858 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:37:51.837840 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9ttl2_1f12cab1-38db-4199-ab17-ed83ce13c27d/csi-liveness-probe/0.log" Apr 20 10:41:26.612086 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:41:26.612000 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log" Apr 20 10:41:26.625040 ip-10-0-138-148 kubenswrapper[2577]: I0420 10:41:26.625019 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwdv7_55296336-b343-4c13-ad2f-c3ceff32fcfe/ovn-acl-logging/0.log"