Apr 23 13:32:13.327811 ip-10-0-137-177 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:32:13.793556 ip-10-0-137-177 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:13.793556 ip-10-0-137-177 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:32:13.793556 ip-10-0-137-177 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:32:13.793556 ip-10-0-137-177 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:32:13.793556 ip-10-0-137-177 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
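The deprecation warnings above all point at the file passed via the kubelet's --config flag. A minimal sketch of that migration, assuming the KubeletConfiguration v1beta1 schema (field names are from that API; the values shown are illustrative, not taken from this node):

```yaml
# Hypothetical KubeletConfiguration fragment replacing the deprecated flags.
# Values are placeholders for illustration only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock  # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:                # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                  # per the warning, use eviction settings
  memory.available: 100Mi      # instead of --minimum-container-ttl-duration
```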
Apr 23 13:32:13.796695 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.796468 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:32:13.798931 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798916 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:13.798931 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798931 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798935 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798938 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798942 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798945 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798948 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798953 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798956 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798960 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798962 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798965 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798969 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798972 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798975 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798978 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798980 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798983 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798986 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798988 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:13.798997 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798991 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798994 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798996 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.798999 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799002 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799005 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799008 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799010 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799013 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799016 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799019 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799021 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799024 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799026 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799036 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799040 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799043 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799045 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799048 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799050 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:13.799453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799053 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799055 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799058 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799060 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799063 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799065 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799068 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799071 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799073 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799075 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799078 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799081 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799083 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799086 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799089 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799092 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799095 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799098 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799101 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799103 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:13.799956 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799106 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799109 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799111 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799115 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799117 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799120 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799123 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799125 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799128 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799131 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799133 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799135 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799138 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799140 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799143 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799145 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799148 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799150 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799153 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799155 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:13.800432 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799158 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799160 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799163 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799166 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799169 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799172 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799542 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799547 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799550 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799553 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799557 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799560 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799562 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799565 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799568 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799571 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799573 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799576 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799581 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799583 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:32:13.800918 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799586 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799589 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799591 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799594 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799597 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799599 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799602 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799605 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799607 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799610 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799613 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799615 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799618 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799620 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799623 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799626 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799629 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799632 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799635 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799638 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:32:13.801402 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799640 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799645 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799649 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799653 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799656 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799659 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799662 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799665 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799668 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799671 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799674 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799677 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799680 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799682 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799685 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799688 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799690 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799693 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799695 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:32:13.801984 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799698 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799700 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799703 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799705 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799708 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799710 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799713 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799715 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799718 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799720 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799724 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799728 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799731 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799734 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799737 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799741 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799743 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799746 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799748 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799751 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:32:13.802447 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799753 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799756 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799759 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799761 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799764 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799772 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799775 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799778 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799781 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799783 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799786 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799789 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.799791 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801233 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801241 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801248 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801253 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801257 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801261 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801265 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801270 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:32:13.802952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801273 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801276 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801281 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801284 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801288 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801291 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801294 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801296 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801300 2576 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801302 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801305 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801310 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801313 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801316 2576 flags.go:64] FLAG: --config-dir=""
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801319 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801322 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801327 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801330 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801333 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801337 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801340 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801344 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801347 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801350 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801353 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:32:13.803459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801358 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801361 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801363 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801366 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801369 2576 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801372 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801378 2576 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801381 2576 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801384 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801387 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801390 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801394 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801397 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801400 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801403 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801406 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801409 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801413 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801416 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801419 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801421 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801424 2576 flags.go:64] FLAG: --feature-gates=""
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801428 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801431 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801434 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 13:32:13.804087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801438 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801441 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801444 2576 flags.go:64] FLAG: --help="false" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801447 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-177.ec2.internal" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801450 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801453 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801456 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801459 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801462 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801465 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801468 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801471 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801474 
2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801477 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801480 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801483 2576 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801487 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801490 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801493 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801496 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801499 2576 flags.go:64] FLAG: --lock-file="" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801512 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801516 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801519 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:32:13.804688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801524 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801527 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801530 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:32:13.805295 
ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801533 2576 flags.go:64] FLAG: --logging-format="text" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801536 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801539 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801542 2576 flags.go:64] FLAG: --manifest-url="" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801546 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801550 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801553 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801558 2576 flags.go:64] FLAG: --max-pods="110" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801561 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801564 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801567 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801570 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801573 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801576 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801579 2576 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801587 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801590 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801593 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801596 2576 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801599 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:32:13.805295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801605 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801609 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801612 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801615 2576 flags.go:64] FLAG: --port="10250" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801619 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801622 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08c7b38bf2da51d09" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801625 2576 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801628 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: 
I0423 13:32:13.801631 2576 flags.go:64] FLAG: --register-node="true" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801634 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801636 2576 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801640 2576 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801643 2576 flags.go:64] FLAG: --registry-qps="5" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801646 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801649 2576 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801653 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801656 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801659 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801662 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801665 2576 flags.go:64] FLAG: --runonce="false" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801668 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801671 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801674 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:32:13.805857 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:32:13.801677 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801680 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801683 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:32:13.805857 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801686 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801689 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801692 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801695 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801698 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801701 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801704 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801709 2576 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801713 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801718 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801721 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801724 2576 
flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801728 2576 flags.go:64] FLAG: --tls-min-version="" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801731 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801734 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801737 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801740 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801743 2576 flags.go:64] FLAG: --v="2" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801747 2576 flags.go:64] FLAG: --version="false" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801752 2576 flags.go:64] FLAG: --vmodule="" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801756 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.801759 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801857 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801862 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:13.806466 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801865 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801868 2576 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801871 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801874 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801876 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801879 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801882 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801885 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801887 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801890 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801892 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801895 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801898 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801901 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: 
W0423 13:32:13.801903 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801907 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801910 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801913 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801915 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801918 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:13.807098 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801920 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801923 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801925 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801927 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801930 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801933 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801935 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 
13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801938 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801940 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801943 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801945 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801948 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801950 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801953 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.801955 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802628 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802634 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802639 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802642 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802645 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:13.807646 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802648 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802651 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802654 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802657 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802660 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802663 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802666 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802669 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802672 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 
13:32:13.802675 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802678 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802681 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802684 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802687 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802689 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802692 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802694 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802697 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802699 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802702 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:13.808190 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802705 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802707 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:13.808763 ip-10-0-137-177 
kubenswrapper[2576]: W0423 13:32:13.802709 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802712 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802714 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802717 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802719 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802722 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802725 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802727 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802730 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802733 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802735 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802738 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802741 2576 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802743 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802746 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802748 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802752 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:13.808763 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802756 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:32:13.809297 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802759 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:13.809297 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802762 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:13.809297 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802764 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:13.809297 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.802767 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:13.809297 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.802775 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true 
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:13.809498 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.809481 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 13:32:13.809498 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.809498 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 13:32:13.809581 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809565 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:13.809581 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809570 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:13.809581 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809573 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:13.809581 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809576 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:13.809581 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809579 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:13.809581 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809582 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:32:13.809581 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809585 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809589 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809591 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809594 2576 feature_gate.go:328] 
unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809597 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809599 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809602 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809611 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809614 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809616 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809619 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809622 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809624 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809627 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809630 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809632 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 
13:32:13.809635 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809638 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809641 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809643 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:13.809784 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809646 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809648 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809651 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809653 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809656 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809658 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809661 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809663 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809666 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:13.810345 ip-10-0-137-177 
kubenswrapper[2576]: W0423 13:32:13.809669 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809671 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809674 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809676 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809679 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809682 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809685 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809687 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809690 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809693 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809695 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:13.810345 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809703 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809706 2576 feature_gate.go:328] 
unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809708 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809711 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809713 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809716 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809718 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809721 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809723 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809726 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809728 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809731 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809733 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809736 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:13.810946 ip-10-0-137-177 
kubenswrapper[2576]: W0423 13:32:13.809739 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809741 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809744 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809746 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809748 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809751 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:13.810946 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809754 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809756 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809759 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809762 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809764 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809767 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809775 2576 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809780 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809784 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809787 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809790 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809793 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809797 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809807 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809812 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809815 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809818 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809820 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809823 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:32:13.811522 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809825 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.809830 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809979 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809985 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809989 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809992 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809995 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.809998 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810001 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810004 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810006 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810009 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810012 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810014 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810017 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:32:13.812042 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810019 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 
13:32:13.810022 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810025 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810027 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810030 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810033 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810036 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810038 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810041 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810044 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810048 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810050 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810058 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810061 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810063 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810066 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810069 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810071 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810074 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810076 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:32:13.812429 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810079 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810081 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810083 2576 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810086 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810088 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810091 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810094 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810097 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810099 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810102 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810104 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810107 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810109 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810112 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810114 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:32:13.812940 
ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810117 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810119 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810122 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810125 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:32:13.812940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810127 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810130 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810132 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810135 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810137 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810140 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810148 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810151 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810153 2576 feature_gate.go:328] unrecognized feature 
gate: BootImageSkewEnforcement Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810156 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810158 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810161 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810163 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810166 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810168 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810171 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810173 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810175 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810178 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810180 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:32:13.813410 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810183 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810186 2576 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810188 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810191 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810193 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810196 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810198 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810200 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810203 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810206 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810208 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810211 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810213 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:13.810216 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:32:13.813921 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:32:13.810221 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:32:13.813921 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.810977 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 13:32:13.814279 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.812958 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 13:32:13.814279 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.813909 2576 server.go:1019] "Starting client certificate rotation" Apr 23 13:32:13.814279 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.814004 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:32:13.814892 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.814880 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 13:32:13.841092 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.841073 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:32:13.848145 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.848118 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 13:32:13.866592 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.866574 2576 log.go:25] "Validated CRI v1 runtime API" Apr 23 
13:32:13.871701 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.871681 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 13:32:13.872633 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.872620 2576 log.go:25] "Validated CRI v1 image API" Apr 23 13:32:13.874384 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.874366 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 13:32:13.877657 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.877631 2576 fs.go:135] Filesystem UUIDs: map[51a8c96e-8804-4a71-adb6-2af31b561dc4:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b4bed54f-e7f7-4ae2-ba46-3d95d087e68c:/dev/nvme0n1p3] Apr 23 13:32:13.877751 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.877654 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 13:32:13.884029 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.883912 2576 manager.go:217] Machine: {Timestamp:2026-04-23 13:32:13.881913105 +0000 UTC m=+0.425657312 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104667 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e4916b76498bfebfa0b6d2050473e SystemUUID:ec2e4916-b764-98bf-ebfa-0b6d2050473e BootID:b6c4c27c-fc39-40a2-a467-82e9bb9e6866 
Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:55:69:ae:1d:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:55:69:ae:1d:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ba:c7:77:b1:99:e9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 13:32:13.884029 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.884022 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 13:32:13.884168 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.884150 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 13:32:13.885942 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.885916 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 13:32:13.886102 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.885944 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-177.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 13:32:13.886144 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.886109 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 13:32:13.886144 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.886118 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 13:32:13.886144 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.886130 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:32:13.887061 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.887050 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:32:13.888282 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.888267 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pp8wj"
Apr 23 13:32:13.888523 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.888499 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:32:13.888641 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.888633 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 13:32:13.892032 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.892022 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 13:32:13.892080 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.892036 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 13:32:13.892080 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.892047 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 13:32:13.892080 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.892062 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 23 13:32:13.892080 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.892077 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 13:32:13.893302 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.893285 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:32:13.893302 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.893305 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:32:13.895249 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.895232 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pp8wj"
Apr 23 13:32:13.896182 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.896161 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 13:32:13.899968 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.899946 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 13:32:13.901779 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901763 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 13:32:13.901845 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901788 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 13:32:13.901845 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901799 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 13:32:13.901845 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901809 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 13:32:13.901845 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901818 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 13:32:13.901845 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901827 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 13:32:13.901845 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901837 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 13:32:13.901845 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901845 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 13:32:13.902036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901855 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 13:32:13.902036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901865 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 13:32:13.902036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901878 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 13:32:13.902036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.901891 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 13:32:13.903909 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.903896 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 13:32:13.903947 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.903912 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 13:32:13.906348 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.906329 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:13.908078 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.908065 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 13:32:13.908132 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.908103 2576 server.go:1295] "Started kubelet"
Apr 23 13:32:13.908229 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.908201 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 13:32:13.908278 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.908195 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 13:32:13.908278 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.908260 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 13:32:13.909064 ip-10-0-137-177 systemd[1]: Started Kubernetes Kubelet.
Apr 23 13:32:13.909654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.909599 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:13.910221 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.910204 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 13:32:13.910814 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.910796 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-177.ec2.internal" not found
Apr 23 13:32:13.911121 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.911106 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 13:32:13.915309 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.915293 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 13:32:13.915843 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.915820 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 13:32:13.916557 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.916542 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 13:32:13.916675 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.916658 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 13:32:13.916765 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.916603 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 13:32:13.916765 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.916719 2576 factory.go:55] Registering systemd factory
Apr 23 13:32:13.916860 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.916773 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 13:32:13.916860 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.916783 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 13:32:13.916860 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.916782 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 23 13:32:13.916997 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:13.916847 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-177.ec2.internal\" not found"
Apr 23 13:32:13.917150 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.917131 2576 factory.go:153] Registering CRI-O factory
Apr 23 13:32:13.917150 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:13.917136 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 13:32:13.917150 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.917151 2576 factory.go:223] Registration of the crio container factory successfully
Apr 23 13:32:13.917339 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.917210 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 13:32:13.917339 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.917238 2576 factory.go:103] Registering Raw factory
Apr 23 13:32:13.917339 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.917255 2576 manager.go:1196] Started watching for new ooms in manager
Apr 23 13:32:13.917699 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.917687 2576 manager.go:319] Starting recovery of all containers
Apr 23 13:32:13.917826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.917811 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:13.920130 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:13.920097 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-177.ec2.internal\" not found" node="ip-10-0-137-177.ec2.internal"
Apr 23 13:32:13.924964 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.924838 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:32:13.926632 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.926607 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-177.ec2.internal" not found
Apr 23 13:32:13.927707 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.927687 2576 manager.go:324] Recovery completed
Apr 23 13:32:13.933475 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.933461 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:13.935458 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.935444 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-177.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:13.935522 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.935473 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-177.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:13.935522 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.935484 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-177.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:13.936039 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.936024 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:32:13.936039 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.936037 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:32:13.936151 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.936053 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:32:13.938602 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.938591 2576 policy_none.go:49] "None policy: Start"
Apr 23 13:32:13.938646 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.938607 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:32:13.938646 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.938617 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:32:13.979147 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.979132 2576 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:13.979199 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.979214 2576 server.go:85] "Starting device plugin registration server"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.979450 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.979464 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.979564 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.979646 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.979655 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:13.980359 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:13.980401 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-177.ec2.internal\" not found"
Apr 23 13:32:13.994654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:13.982808 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-177.ec2.internal" not found
Apr 23 13:32:14.056269 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.056186 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:32:14.056269 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.056223 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:32:14.056269 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.056242 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:32:14.056269 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.056248 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:32:14.056482 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:14.056335 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:32:14.059613 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.059589 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:32:14.080206 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.080178 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:32:14.081259 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.081240 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-177.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:32:14.081357 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.081270 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-177.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:32:14.081357 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.081281 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-177.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:32:14.081357 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.081305 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.088349 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.088331 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.088424 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:14.088353 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-177.ec2.internal\": node \"ip-10-0-137-177.ec2.internal\" not found"
Apr 23 13:32:14.157220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.157164 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal"]
Apr 23 13:32:14.159641 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.159622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.159739 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.159621 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.182380 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.182360 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.186970 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.186953 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.193754 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.193739 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:32:14.199718 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.199705 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:32:14.318650 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.318574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06e1c4c209f8543cc577f01ef69cf08e-config\") pod \"kube-apiserver-proxy-ip-10-0-137-177.ec2.internal\" (UID: \"06e1c4c209f8543cc577f01ef69cf08e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.318650 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.318604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e44658396063ef3d364eee1cf1f44e80-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal\" (UID: \"e44658396063ef3d364eee1cf1f44e80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.318650 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.318623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e44658396063ef3d364eee1cf1f44e80-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal\" (UID: \"e44658396063ef3d364eee1cf1f44e80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.419345 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.419321 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e44658396063ef3d364eee1cf1f44e80-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal\" (UID: \"e44658396063ef3d364eee1cf1f44e80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.419445 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.419348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e44658396063ef3d364eee1cf1f44e80-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal\" (UID: \"e44658396063ef3d364eee1cf1f44e80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.419445 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.419372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06e1c4c209f8543cc577f01ef69cf08e-config\") pod \"kube-apiserver-proxy-ip-10-0-137-177.ec2.internal\" (UID: \"06e1c4c209f8543cc577f01ef69cf08e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.419445 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.419407 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e44658396063ef3d364eee1cf1f44e80-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal\" (UID: \"e44658396063ef3d364eee1cf1f44e80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.419445 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.419415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e44658396063ef3d364eee1cf1f44e80-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal\" (UID: \"e44658396063ef3d364eee1cf1f44e80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.419586 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.419450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/06e1c4c209f8543cc577f01ef69cf08e-config\") pod \"kube-apiserver-proxy-ip-10-0-137-177.ec2.internal\" (UID: \"06e1c4c209f8543cc577f01ef69cf08e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.496971 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.496938 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.502550 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.502528 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal"
Apr 23 13:32:14.813910 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.813825 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 13:32:14.814588 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.814063 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:32:14.814588 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.814067 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:32:14.814588 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.814063 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:32:14.892435 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.892403 2576 apiserver.go:52] "Watching apiserver"
Apr 23 13:32:14.897457 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.897429 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:27:13 +0000 UTC" deadline="2027-12-29 22:42:26.893763226 +0000 UTC"
Apr 23 13:32:14.897457 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.897457 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14769h10m11.996308803s"
Apr 23 13:32:14.899400 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.899379 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 13:32:14.899897 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.899875 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal","openshift-multus/multus-7qqkv","openshift-multus/network-metrics-daemon-wzp5m","openshift-network-diagnostics/network-check-target-fnd8j","openshift-ovn-kubernetes/ovnkube-node-lhgvj","kube-system/konnectivity-agent-dpm52","kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal","openshift-cluster-node-tuning-operator/tuned-mcpv6","openshift-dns/node-resolver-v6kpm","openshift-multus/multus-additional-cni-plugins-hhztr","openshift-network-operator/iptables-alerter-7jsd5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9","openshift-image-registry/node-ca-n7dgj"]
Apr 23 13:32:14.904339 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.904291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:14.906060 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.906030 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:14.906180 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:14.906151 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:14.907851 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.907836 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:14.907946 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:14.907931 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:14.909239 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.909223 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:14.909954 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.909934 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 13:32:14.910052 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.909956 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5r6gd\""
Apr 23 13:32:14.910052 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.910037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 13:32:14.910238 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.910223 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 13:32:14.910442 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.910424 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 13:32:14.910703 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.910680 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dpm52"
Apr 23 13:32:14.913991 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.913974 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:14.914128 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.914097 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 13:32:14.914320 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.914302 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 13:32:14.915131 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.915109 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 13:32:14.915311 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.915271 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bqtrf\""
Apr 23 13:32:14.915414 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.915371 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 13:32:14.915477 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.915413 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 13:32:14.915903 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.915692 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 13:32:14.915903 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.915705 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 13:32:14.915903 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.915761 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7vkbp\""
Apr 23 13:32:14.915903 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.915776 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 13:32:14.916191 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.916004 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 13:32:14.916747 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.916729 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnpgw\""
Apr 23 13:32:14.916898 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.916820 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:32:14.916898 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.916860 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v6kpm"
Apr 23 13:32:14.917266 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.917249 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 13:32:14.919223 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.919206 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:14.919435 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.919389 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 13:32:14.919875 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.919652 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wk8bw\""
Apr 23 13:32:14.919957 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.919910 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 13:32:14.920955 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.920938 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7jsd5"
Apr 23 13:32:14.921666 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.921630 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 13:32:14.922050 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922018 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 13:32:14.922050 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922026 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2dsfl\""
Apr 23 13:32:14.922483 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:14.922826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:14.922928 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922831 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5762a240-1436-4c52-bead-2abd75c01895-agent-certs\") pod \"konnectivity-agent-dpm52\" (UID: \"5762a240-1436-4c52-bead-2abd75c01895\") " pod="kube-system/konnectivity-agent-dpm52" Apr 23 13:32:14.922928 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922847 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5762a240-1436-4c52-bead-2abd75c01895-konnectivity-ca\") pod \"konnectivity-agent-dpm52\" (UID: \"5762a240-1436-4c52-bead-2abd75c01895\") " pod="kube-system/konnectivity-agent-dpm52" Apr 23 13:32:14.922928 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysctl-d\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.922928 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-conf-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.922928 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-etc-kubernetes\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-systemd\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.922987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-cni-bin\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923017 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/047f80fa-7458-4de1-b0e4-f52fea4fbe72-tmp\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.923163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923044 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysctl-conf\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.923163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-var-lib-kubelet\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.923163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-run-ovn-kubernetes\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923121 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-cni-multus\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-hostroot\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 
13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc7j\" (UniqueName: \"kubernetes.io/projected/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-kube-api-access-tpc7j\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923211 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-cni-netd\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923233 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzfwx\" (UniqueName: \"kubernetes.io/projected/8b934c24-9a04-47cb-a0a9-ce2109c8b735-kube-api-access-vzfwx\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-systemd\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-openvswitch\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-node-log\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e662fef7-fd2a-4a55-91de-e3ed361dab06-hosts-file\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm" Apr 23 13:32:14.923425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923402 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkmrc\" (UniqueName: \"kubernetes.io/projected/fde80200-8a4e-4844-91f0-ed8f18a92617-kube-api-access-vkmrc\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:32:14.923431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysconfig\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-sys\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-system-cni-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-cni-binary-copy\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-k8s-cni-cncf-io\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 
13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-cni-bin\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-log-socket\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-lib-modules\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-socket-dir-parent\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovnkube-config\") pod \"ovnkube-node-lhgvj\" (UID: 
\"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovnkube-script-lib\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-system-cni-dir\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-cnibin\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-kubelet\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-host\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-os-release\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:14.923869 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-os-release\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-env-overrides\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovn-node-metrics-cert\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923906 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz49g\" (UniqueName: \"kubernetes.io/projected/047f80fa-7458-4de1-b0e4-f52fea4fbe72-kube-api-access-gz49g\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e662fef7-fd2a-4a55-91de-e3ed361dab06-tmp-dir\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwzc6\" (UniqueName: \"kubernetes.io/projected/e662fef7-fd2a-4a55-91de-e3ed361dab06-kube-api-access-nwzc6\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.923999 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-cni-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-multus-certs\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.924583 
ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-slash\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-etc-openvswitch\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8gvz\" (UniqueName: \"kubernetes.io/projected/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-kube-api-access-z8gvz\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924132 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 13:32:14.924583 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:32:14.924135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-daemon-config\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924202 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n7dgj" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-systemd-units\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924244 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 13:32:14.924583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924266 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924287 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cnibin\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-kubernetes\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924359 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-kubelet\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhztr\" (UID: 
\"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-run\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-var-lib-openvswitch\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-tuned\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-netns\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-run-netns\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-modprobe-d\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-ovn\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:14.925088 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.924855 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ft7qb\""
Apr 23 13:32:14.925564 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.925134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 13:32:14.925564 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.925319 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 13:32:14.925564 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.925452 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 13:32:14.928703 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.928610 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 13:32:14.928703 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.928614 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 13:32:14.928703 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.928680 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 13:32:14.928891 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.928680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-svnt7\""
Apr 23 13:32:14.928891 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.928788 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-b6qz7\""
Apr 23 13:32:14.931884 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.931862 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:32:14.951812 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.951670 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dlxhh"
Apr 23 13:32:14.958387 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:14.958370 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dlxhh"
Apr 23 13:32:15.017517 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.017487 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 13:32:15.025015 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.024991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-ovn\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025130 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drgb9\" (UniqueName: \"kubernetes.io/projected/e2865e96-7bac-4087-bb1b-0cf266b4deb0-kube-api-access-drgb9\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " pod="openshift-network-operator/iptables-alerter-7jsd5"
Apr 23 13:32:15.025130 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.025130 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5762a240-1436-4c52-bead-2abd75c01895-agent-certs\") pod \"konnectivity-agent-dpm52\" (UID: \"5762a240-1436-4c52-bead-2abd75c01895\") " pod="kube-system/konnectivity-agent-dpm52"
Apr 23 13:32:15.025130 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5762a240-1436-4c52-bead-2abd75c01895-konnectivity-ca\") pod \"konnectivity-agent-dpm52\" (UID: \"5762a240-1436-4c52-bead-2abd75c01895\") " pod="kube-system/konnectivity-agent-dpm52"
Apr 23 13:32:15.025130 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025103 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-ovn\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysctl-d\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-conf-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-etc-kubernetes\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-systemd\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-cni-bin\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-conf-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/047f80fa-7458-4de1-b0e4-f52fea4fbe72-tmp\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysctl-conf\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.025367 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-var-lib-kubelet\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-run-ovn-kubernetes\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-cni-multus\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-hostroot\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-etc-kubernetes\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc7j\" (UniqueName: \"kubernetes.io/projected/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-kube-api-access-tpc7j\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-cni-netd\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-systemd\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-cni-bin\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-sys-fs\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-cni-multus\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzfwx\" (UniqueName: \"kubernetes.io/projected/8b934c24-9a04-47cb-a0a9-ce2109c8b735-kube-api-access-vzfwx\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-systemd\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-openvswitch\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysctl-d\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025697 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-node-log\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.025803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5762a240-1436-4c52-bead-2abd75c01895-konnectivity-ca\") pod \"konnectivity-agent-dpm52\" (UID: \"5762a240-1436-4c52-bead-2abd75c01895\") " pod="kube-system/konnectivity-agent-dpm52"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025741 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-node-log\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025743 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-hostroot\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysctl-conf\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-cni-netd\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-var-lib-kubelet\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-run-ovn-kubernetes\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.025738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-socket-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-systemd\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e662fef7-fd2a-4a55-91de-e3ed361dab06-hosts-file\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-run-openvswitch\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-device-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de702b67-cf80-4b1f-b30b-e4a459ac038e-host\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e662fef7-fd2a-4a55-91de-e3ed361dab06-hosts-file\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkmrc\" (UniqueName: \"kubernetes.io/projected/fde80200-8a4e-4844-91f0-ed8f18a92617-kube-api-access-vkmrc\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.026575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysconfig\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026217 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-sys\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-system-cni-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-cni-binary-copy\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-sysconfig\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-k8s-cni-cncf-io\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-cni-bin\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-log-socket\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-lib-modules\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-socket-dir-parent\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovnkube-config\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovnkube-script-lib\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-log-socket\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026537 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-system-cni-dir\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-lib-modules\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-k8s-cni-cncf-io\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-cni-bin\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.027388 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-system-cni-dir\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-cnibin\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-sys\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-kubelet\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-var-lib-kubelet\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-cnibin\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-host\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-socket-dir-parent\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-os-release\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-host\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-cni-binary-copy\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-os-release\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-os-release\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-env-overrides\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-system-cni-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovn-node-metrics-cert\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026934 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-os-release\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz49g\" (UniqueName: \"kubernetes.io/projected/047f80fa-7458-4de1-b0e4-f52fea4fbe72-kube-api-access-gz49g\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6"
Apr 23 13:32:15.028220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.026997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e662fef7-fd2a-4a55-91de-e3ed361dab06-tmp-dir\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm"
Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de702b67-cf80-4b1f-b30b-e4a459ac038e-serviceca\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj"
Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x85gd\" (UniqueName: \"kubernetes.io/projected/de702b67-cf80-4b1f-b30b-e4a459ac038e-kube-api-access-x85gd\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj"
Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwzc6\"
(UniqueName: \"kubernetes.io/projected/e662fef7-fd2a-4a55-91de-e3ed361dab06-kube-api-access-nwzc6\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-cni-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovnkube-script-lib\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-cni-dir\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e662fef7-fd2a-4a55-91de-e3ed361dab06-tmp-dir\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-multus-certs\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027430 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-slash\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-etc-openvswitch\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-slash\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z8gvz\" (UniqueName: \"kubernetes.io/projected/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-kube-api-access-z8gvz\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027559 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-daemon-config\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-systemd-units\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2865e96-7bac-4087-bb1b-0cf266b4deb0-host-slash\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027625 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cnibin\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-kubernetes\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-kubelet\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2865e96-7bac-4087-bb1b-0cf266b4deb0-iptables-alerter-script\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " 
pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovnkube-config\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-etc-selinux\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-run\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-var-lib-openvswitch\") pod 
\"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-registration-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-tuned\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-netns\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-run-netns\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh4mq\" (UniqueName: 
\"kubernetes.io/projected/5941c699-21b4-4722-baea-e35ca0811594-kube-api-access-hh4mq\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:15.029791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028021 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-modprobe-d\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cni-binary-copy\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-modprobe-d\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028156 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-etc-openvswitch\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.027448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-multus-certs\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028428 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-systemd-units\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:32:15.028488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b934c24-9a04-47cb-a0a9-ce2109c8b735-cnibin\") pod \"multus-additional-cni-plugins-hhztr\" (UID: \"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-kubernetes\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-multus-daemon-config\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-host-run-netns\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028737 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-run-netns\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.028772 2576 secret.go:189] 
Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-host-kubelet\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.028840 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.528821378 +0000 UTC m=+2.072565575 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-var-lib-openvswitch\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.030416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.028885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/047f80fa-7458-4de1-b0e4-f52fea4fbe72-run\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 
13:32:15.031167 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.029216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-env-overrides\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.031167 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.029607 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e1c4c209f8543cc577f01ef69cf08e.slice/crio-99356e37e7a973a38971984e7e26a37f0fcd9e4d29e403cd6a919d2cf75aaff6 WatchSource:0}: Error finding container 99356e37e7a973a38971984e7e26a37f0fcd9e4d29e403cd6a919d2cf75aaff6: Status 404 returned error can't find the container with id 99356e37e7a973a38971984e7e26a37f0fcd9e4d29e403cd6a919d2cf75aaff6 Apr 23 13:32:15.031167 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.029895 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode44658396063ef3d364eee1cf1f44e80.slice/crio-6800aeea8bbd77427c235518e2ea12bedf23070fbbe2240be4ac95aa2d680aa5 WatchSource:0}: Error finding container 6800aeea8bbd77427c235518e2ea12bedf23070fbbe2240be4ac95aa2d680aa5: Status 404 returned error can't find the container with id 6800aeea8bbd77427c235518e2ea12bedf23070fbbe2240be4ac95aa2d680aa5 Apr 23 13:32:15.031167 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.030648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/047f80fa-7458-4de1-b0e4-f52fea4fbe72-tmp\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.031167 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.030882 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5762a240-1436-4c52-bead-2abd75c01895-agent-certs\") pod \"konnectivity-agent-dpm52\" (UID: \"5762a240-1436-4c52-bead-2abd75c01895\") " pod="kube-system/konnectivity-agent-dpm52" Apr 23 13:32:15.031167 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.030931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/047f80fa-7458-4de1-b0e4-f52fea4fbe72-etc-tuned\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.032556 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.032523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-ovn-node-metrics-cert\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.034315 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.034296 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:32:15.035010 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.034981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkmrc\" (UniqueName: \"kubernetes.io/projected/fde80200-8a4e-4844-91f0-ed8f18a92617-kube-api-access-vkmrc\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:15.035545 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.035466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzfwx\" (UniqueName: \"kubernetes.io/projected/8b934c24-9a04-47cb-a0a9-ce2109c8b735-kube-api-access-vzfwx\") pod \"multus-additional-cni-plugins-hhztr\" (UID: 
\"8b934c24-9a04-47cb-a0a9-ce2109c8b735\") " pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:15.035700 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.035673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwzc6\" (UniqueName: \"kubernetes.io/projected/e662fef7-fd2a-4a55-91de-e3ed361dab06-kube-api-access-nwzc6\") pod \"node-resolver-v6kpm\" (UID: \"e662fef7-fd2a-4a55-91de-e3ed361dab06\") " pod="openshift-dns/node-resolver-v6kpm" Apr 23 13:32:15.035932 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.035910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc7j\" (UniqueName: \"kubernetes.io/projected/6584cca1-f6ed-4d94-8644-5eb9b59e13e6-kube-api-access-tpc7j\") pod \"multus-7qqkv\" (UID: \"6584cca1-f6ed-4d94-8644-5eb9b59e13e6\") " pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.038117 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.038099 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:15.038210 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.038121 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:15.038210 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.038134 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xcs7b for pod openshift-network-diagnostics/network-check-target-fnd8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:15.038210 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.038200 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b podName:73b441a9-2b94-42af-ba5d-7d626ce72613 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:15.538181187 +0000 UTC m=+2.081925395 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xcs7b" (UniqueName: "kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b") pod "network-check-target-fnd8j" (UID: "73b441a9-2b94-42af-ba5d-7d626ce72613") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:15.039351 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.039334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8gvz\" (UniqueName: \"kubernetes.io/projected/2b98e81e-dc6f-4d15-b8ec-77a01c0ee951-kube-api-access-z8gvz\") pod \"ovnkube-node-lhgvj\" (UID: \"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951\") " pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.040062 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.040042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz49g\" (UniqueName: \"kubernetes.io/projected/047f80fa-7458-4de1-b0e4-f52fea4fbe72-kube-api-access-gz49g\") pod \"tuned-mcpv6\" (UID: \"047f80fa-7458-4de1-b0e4-f52fea4fbe72\") " pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.059682 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.059641 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal" event={"ID":"06e1c4c209f8543cc577f01ef69cf08e","Type":"ContainerStarted","Data":"99356e37e7a973a38971984e7e26a37f0fcd9e4d29e403cd6a919d2cf75aaff6"} Apr 23 13:32:15.060560 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.060539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal" event={"ID":"e44658396063ef3d364eee1cf1f44e80","Type":"ContainerStarted","Data":"6800aeea8bbd77427c235518e2ea12bedf23070fbbe2240be4ac95aa2d680aa5"} Apr 23 13:32:15.129295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drgb9\" (UniqueName: \"kubernetes.io/projected/e2865e96-7bac-4087-bb1b-0cf266b4deb0-kube-api-access-drgb9\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.129295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.129295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-sys-fs\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129315 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-socket-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:32:15.129356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-device-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-sys-fs\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de702b67-cf80-4b1f-b30b-e4a459ac038e-host\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129422 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de702b67-cf80-4b1f-b30b-e4a459ac038e-host\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129469 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-device-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de702b67-cf80-4b1f-b30b-e4a459ac038e-serviceca\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-socket-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.129585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x85gd\" (UniqueName: \"kubernetes.io/projected/de702b67-cf80-4b1f-b30b-e4a459ac038e-kube-api-access-x85gd\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj" Apr 23 13:32:15.130048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2865e96-7bac-4087-bb1b-0cf266b4deb0-host-slash\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.130048 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:32:15.129654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e2865e96-7bac-4087-bb1b-0cf266b4deb0-host-slash\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.130048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2865e96-7bac-4087-bb1b-0cf266b4deb0-iptables-alerter-script\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.130048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-etc-selinux\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.130048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-registration-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.130048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129775 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh4mq\" (UniqueName: \"kubernetes.io/projected/5941c699-21b4-4722-baea-e35ca0811594-kube-api-access-hh4mq\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: 
\"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.130048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-etc-selinux\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.130048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de702b67-cf80-4b1f-b30b-e4a459ac038e-serviceca\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj" Apr 23 13:32:15.130048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.129851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5941c699-21b4-4722-baea-e35ca0811594-registration-dir\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.130399 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.130252 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e2865e96-7bac-4087-bb1b-0cf266b4deb0-iptables-alerter-script\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.141239 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.141209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drgb9\" (UniqueName: 
\"kubernetes.io/projected/e2865e96-7bac-4087-bb1b-0cf266b4deb0-kube-api-access-drgb9\") pod \"iptables-alerter-7jsd5\" (UID: \"e2865e96-7bac-4087-bb1b-0cf266b4deb0\") " pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.142046 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.142026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh4mq\" (UniqueName: \"kubernetes.io/projected/5941c699-21b4-4722-baea-e35ca0811594-kube-api-access-hh4mq\") pod \"aws-ebs-csi-driver-node-m7xq9\" (UID: \"5941c699-21b4-4722-baea-e35ca0811594\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.142192 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.142177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x85gd\" (UniqueName: \"kubernetes.io/projected/de702b67-cf80-4b1f-b30b-e4a459ac038e-kube-api-access-x85gd\") pod \"node-ca-n7dgj\" (UID: \"de702b67-cf80-4b1f-b30b-e4a459ac038e\") " pod="openshift-image-registry/node-ca-n7dgj" Apr 23 13:32:15.234414 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.234386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7qqkv" Apr 23 13:32:15.240157 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.240130 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6584cca1_f6ed_4d94_8644_5eb9b59e13e6.slice/crio-3899c33c325596b17561e0d89a6d46f543ee4639752af7bcbd533483e17fc324 WatchSource:0}: Error finding container 3899c33c325596b17561e0d89a6d46f543ee4639752af7bcbd533483e17fc324: Status 404 returned error can't find the container with id 3899c33c325596b17561e0d89a6d46f543ee4639752af7bcbd533483e17fc324 Apr 23 13:32:15.259512 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.259486 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:15.267172 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.267141 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b98e81e_dc6f_4d15_b8ec_77a01c0ee951.slice/crio-e74705716cf0e0bb309ac8f97bec59cc1fb23e06ba9f6b6cc080ca021fc60757 WatchSource:0}: Error finding container e74705716cf0e0bb309ac8f97bec59cc1fb23e06ba9f6b6cc080ca021fc60757: Status 404 returned error can't find the container with id e74705716cf0e0bb309ac8f97bec59cc1fb23e06ba9f6b6cc080ca021fc60757 Apr 23 13:32:15.271137 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.271117 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dpm52" Apr 23 13:32:15.276828 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.276806 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5762a240_1436_4c52_bead_2abd75c01895.slice/crio-b4d6513621782c3843ced390c3bb09092227a5a9cf82722a3d092067fb51f2e6 WatchSource:0}: Error finding container b4d6513621782c3843ced390c3bb09092227a5a9cf82722a3d092067fb51f2e6: Status 404 returned error can't find the container with id b4d6513621782c3843ced390c3bb09092227a5a9cf82722a3d092067fb51f2e6 Apr 23 13:32:15.286791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.286769 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" Apr 23 13:32:15.293154 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.293104 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod047f80fa_7458_4de1_b0e4_f52fea4fbe72.slice/crio-e17dc6f347a6954c83102941816dda31ad4c5f297b64ad57620eb614b326cf52 WatchSource:0}: Error finding container e17dc6f347a6954c83102941816dda31ad4c5f297b64ad57620eb614b326cf52: Status 404 returned error can't find the container with id e17dc6f347a6954c83102941816dda31ad4c5f297b64ad57620eb614b326cf52 Apr 23 13:32:15.305284 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.305261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v6kpm" Apr 23 13:32:15.311221 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.311185 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode662fef7_fd2a_4a55_91de_e3ed361dab06.slice/crio-676611b6df1ff1e07dade57e0be3ed3e5ca39f96cce716bd0c0b3dedb6484efa WatchSource:0}: Error finding container 676611b6df1ff1e07dade57e0be3ed3e5ca39f96cce716bd0c0b3dedb6484efa: Status 404 returned error can't find the container with id 676611b6df1ff1e07dade57e0be3ed3e5ca39f96cce716bd0c0b3dedb6484efa Apr 23 13:32:15.324656 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.324638 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hhztr" Apr 23 13:32:15.331071 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.331049 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b934c24_9a04_47cb_a0a9_ce2109c8b735.slice/crio-bd632bd9147320140804d25ad25a0dd254cdfb7a3c87736966dbff3f808e6100 WatchSource:0}: Error finding container bd632bd9147320140804d25ad25a0dd254cdfb7a3c87736966dbff3f808e6100: Status 404 returned error can't find the container with id bd632bd9147320140804d25ad25a0dd254cdfb7a3c87736966dbff3f808e6100 Apr 23 13:32:15.336776 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.336759 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7jsd5" Apr 23 13:32:15.342374 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.342354 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2865e96_7bac_4087_bb1b_0cf266b4deb0.slice/crio-77110bdaf8f5e61e61eb9f128b303384b3af04e349df9385a4d0636dbd6c17f1 WatchSource:0}: Error finding container 77110bdaf8f5e61e61eb9f128b303384b3af04e349df9385a4d0636dbd6c17f1: Status 404 returned error can't find the container with id 77110bdaf8f5e61e61eb9f128b303384b3af04e349df9385a4d0636dbd6c17f1 Apr 23 13:32:15.342558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.342413 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" Apr 23 13:32:15.347154 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.347135 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n7dgj" Apr 23 13:32:15.349057 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.349039 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5941c699_21b4_4722_baea_e35ca0811594.slice/crio-2f7854a5a8d44b4f3f92e4da2e6eccba741386ebb1f5c5de0d2609385f85b5c7 WatchSource:0}: Error finding container 2f7854a5a8d44b4f3f92e4da2e6eccba741386ebb1f5c5de0d2609385f85b5c7: Status 404 returned error can't find the container with id 2f7854a5a8d44b4f3f92e4da2e6eccba741386ebb1f5c5de0d2609385f85b5c7 Apr 23 13:32:15.353205 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:32:15.353186 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde702b67_cf80_4b1f_b30b_e4a459ac038e.slice/crio-e884fa5223f51b6fea4a9f4ff469faeae0ec3d088addd7512de93e5ffcfda240 WatchSource:0}: Error finding container e884fa5223f51b6fea4a9f4ff469faeae0ec3d088addd7512de93e5ffcfda240: Status 404 returned error can't find the container with id e884fa5223f51b6fea4a9f4ff469faeae0ec3d088addd7512de93e5ffcfda240 Apr 23 13:32:15.531891 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.531721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:15.531891 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.531879 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:15.532100 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.531978 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:16.531933402 +0000 UTC m=+3.075677599 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:15.632148 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.632117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:15.632315 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.632277 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:15.632315 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.632300 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:15.632315 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.632313 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xcs7b for pod openshift-network-diagnostics/network-check-target-fnd8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:15.632475 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:15.632374 2576 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b podName:73b441a9-2b94-42af-ba5d-7d626ce72613 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:16.632356009 +0000 UTC m=+3.176100208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xcs7b" (UniqueName: "kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b") pod "network-check-target-fnd8j" (UID: "73b441a9-2b94-42af-ba5d-7d626ce72613") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:15.680217 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.679989 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:15.959246 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.959124 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:14 +0000 UTC" deadline="2027-10-08 21:56:58.542532366 +0000 UTC" Apr 23 13:32:15.959246 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.959156 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12800h24m42.583379627s" Apr 23 13:32:15.997220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:15.997188 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:16.078867 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.078784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7jsd5" event={"ID":"e2865e96-7bac-4087-bb1b-0cf266b4deb0","Type":"ContainerStarted","Data":"77110bdaf8f5e61e61eb9f128b303384b3af04e349df9385a4d0636dbd6c17f1"} Apr 23 
13:32:16.085963 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.085874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" event={"ID":"047f80fa-7458-4de1-b0e4-f52fea4fbe72","Type":"ContainerStarted","Data":"e17dc6f347a6954c83102941816dda31ad4c5f297b64ad57620eb614b326cf52"} Apr 23 13:32:16.120791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.120757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dpm52" event={"ID":"5762a240-1436-4c52-bead-2abd75c01895","Type":"ContainerStarted","Data":"b4d6513621782c3843ced390c3bb09092227a5a9cf82722a3d092067fb51f2e6"} Apr 23 13:32:16.139492 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.139397 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"e74705716cf0e0bb309ac8f97bec59cc1fb23e06ba9f6b6cc080ca021fc60757"} Apr 23 13:32:16.157706 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.157622 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qqkv" event={"ID":"6584cca1-f6ed-4d94-8644-5eb9b59e13e6","Type":"ContainerStarted","Data":"3899c33c325596b17561e0d89a6d46f543ee4639752af7bcbd533483e17fc324"} Apr 23 13:32:16.175283 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.174998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n7dgj" event={"ID":"de702b67-cf80-4b1f-b30b-e4a459ac038e","Type":"ContainerStarted","Data":"e884fa5223f51b6fea4a9f4ff469faeae0ec3d088addd7512de93e5ffcfda240"} Apr 23 13:32:16.187855 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.187824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" 
event={"ID":"5941c699-21b4-4722-baea-e35ca0811594","Type":"ContainerStarted","Data":"2f7854a5a8d44b4f3f92e4da2e6eccba741386ebb1f5c5de0d2609385f85b5c7"} Apr 23 13:32:16.204426 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.204237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhztr" event={"ID":"8b934c24-9a04-47cb-a0a9-ce2109c8b735","Type":"ContainerStarted","Data":"bd632bd9147320140804d25ad25a0dd254cdfb7a3c87736966dbff3f808e6100"} Apr 23 13:32:16.208002 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.207699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v6kpm" event={"ID":"e662fef7-fd2a-4a55-91de-e3ed361dab06","Type":"ContainerStarted","Data":"676611b6df1ff1e07dade57e0be3ed3e5ca39f96cce716bd0c0b3dedb6484efa"} Apr 23 13:32:16.240349 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.240253 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:32:16.540167 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.540002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:16.540334 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:16.540177 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:16.540334 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:16.540252 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:18.540229517 +0000 UTC m=+5.083973712 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:16.640982 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.640934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:16.641186 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:16.641174 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:32:16.641248 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:16.641197 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:32:16.641248 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:16.641211 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xcs7b for pod openshift-network-diagnostics/network-check-target-fnd8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:16.641339 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:16.641275 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b 
podName:73b441a9-2b94-42af-ba5d-7d626ce72613 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:18.641256478 +0000 UTC m=+5.185000676 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xcs7b" (UniqueName: "kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b") pod "network-check-target-fnd8j" (UID: "73b441a9-2b94-42af-ba5d-7d626ce72613") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:32:16.960222 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.960082 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:27:14 +0000 UTC" deadline="2027-10-10 20:33:52.72368272 +0000 UTC" Apr 23 13:32:16.960222 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:16.960129 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12847h1m35.763557297s" Apr 23 13:32:17.057543 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:17.057485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:17.057706 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:17.057656 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617" Apr 23 13:32:17.058158 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:17.058140 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:17.058248 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:17.058230 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613" Apr 23 13:32:18.561936 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:18.561892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:18.562371 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:18.562066 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:18.562371 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:18.562133 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.562113335 +0000 UTC m=+9.105857542 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:18.662722 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:18.662685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:18.662892 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:18.662850 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:18.662892 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:18.662868 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:18.662892 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:18.662880 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xcs7b for pod openshift-network-diagnostics/network-check-target-fnd8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:18.663062 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:18.662939 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b podName:73b441a9-2b94-42af-ba5d-7d626ce72613 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.66292125 +0000 UTC m=+9.206665457 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xcs7b" (UniqueName: "kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b") pod "network-check-target-fnd8j" (UID: "73b441a9-2b94-42af-ba5d-7d626ce72613") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:19.056719 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:19.056610 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:19.056889 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:19.056748 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:19.056966 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:19.056610 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:19.057086 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:19.057058 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:21.056810 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:21.056766 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:21.057294 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:21.056766 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:21.057294 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:21.056916 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:21.057294 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:21.056996 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:22.596005 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:22.595914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:22.596531 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:22.596029 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:22.596531 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:22.596088 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:30.596069862 +0000 UTC m=+17.139814060 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:22.696365 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:22.696292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:22.696553 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:22.696468 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:22.696553 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:22.696495 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:22.696553 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:22.696524 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xcs7b for pod openshift-network-diagnostics/network-check-target-fnd8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:22.696726 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:22.696588 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b podName:73b441a9-2b94-42af-ba5d-7d626ce72613 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:30.696566425 +0000 UTC m=+17.240310623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xcs7b" (UniqueName: "kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b") pod "network-check-target-fnd8j" (UID: "73b441a9-2b94-42af-ba5d-7d626ce72613") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:23.056957 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:23.056851 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:23.057106 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:23.056851 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:23.057106 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:23.056999 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:23.057233 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:23.057128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:25.056674 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:25.056635 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:25.057091 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:25.056635 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:25.057091 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:25.056760 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:25.057091 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:25.056850 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:27.057426 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:27.057385 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:27.057998 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:27.057395 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:27.057998 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:27.057562 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:27.057998 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:27.057630 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:29.056604 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:29.056567 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:29.057059 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:29.056695 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:29.057059 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:29.056745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:29.057059 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:29.056866 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:30.652909 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:30.652871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:30.653383 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:30.653043 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:30.653383 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:30.653129 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.653107678 +0000 UTC m=+33.196851873 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:30.754261 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:30.754228 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:30.754458 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:30.754419 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:30.754458 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:30.754446 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:30.754458 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:30.754459 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xcs7b for pod openshift-network-diagnostics/network-check-target-fnd8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:30.754640 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:30.754543 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b podName:73b441a9-2b94-42af-ba5d-7d626ce72613 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.754520373 +0000 UTC m=+33.298264580 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xcs7b" (UniqueName: "kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b") pod "network-check-target-fnd8j" (UID: "73b441a9-2b94-42af-ba5d-7d626ce72613") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:31.056474 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:31.056391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:31.056629 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:31.056391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:31.056629 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:31.056527 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:31.056629 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:31.056609 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:33.057345 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:33.057309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:33.057761 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:33.057309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:33.057761 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:33.057448 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:33.057761 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:33.057541 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:34.246649 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.246378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal" event={"ID":"06e1c4c209f8543cc577f01ef69cf08e","Type":"ContainerStarted","Data":"c1c712890ef65b5b71d5fec4c102c72b1331582787bfd43e0423dbbacb295383"}
Apr 23 13:32:34.251176 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.251137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" event={"ID":"047f80fa-7458-4de1-b0e4-f52fea4fbe72","Type":"ContainerStarted","Data":"112487de92c4dee2fa87b885f7c0ec259e9a518e2e92c1074165f9b2b90d8e87"}
Apr 23 13:32:34.255673 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.255644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"13207257aef007faa1eaee2acfd512a89d1f90d13edbf6d7bbe5c12cec5fa5a7"}
Apr 23 13:32:34.255826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.255681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"c2f1d685f09fbb2f16201cb4a4f6350dd8c91067d86ee77ae37b3d407634c88f"}
Apr 23 13:32:34.255826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.255691 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"bebdca6f43d2c61042ca23e5b5d3052e8dee1766a5150d1c7494d5308a236897"}
Apr 23 13:32:34.255826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.255701 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"272ab921c2a6baa7fcabd889ebd5e1e9ece1ee7ac8cb72da3e03aee8a50d8676"}
Apr 23 13:32:34.255826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.255725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"ae74e82db060769bfb8d43e55c243768b31652ff2688ac3b0da07ce55f798d4d"}
Apr 23 13:32:34.255826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.255737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"99c80010f8c567bf9c70340f45f6d0a9cd52635de7e18c32bbe492e48dd5ce7b"}
Apr 23 13:32:34.257543 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.257496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qqkv" event={"ID":"6584cca1-f6ed-4d94-8644-5eb9b59e13e6","Type":"ContainerStarted","Data":"90c79c5182d02a75f9292c8318760578b400b352ddaada98b757ed340c083cf6"}
Apr 23 13:32:34.263443 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.263371 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-177.ec2.internal" podStartSLOduration=20.263353315 podStartE2EDuration="20.263353315s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:34.262740066 +0000 UTC m=+20.806484285" watchObservedRunningTime="2026-04-23 13:32:34.263353315 +0000 UTC m=+20.807097531"
Apr 23 13:32:34.281579 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.281460 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mcpv6" podStartSLOduration=2.368950778 podStartE2EDuration="20.281441677s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.294238611 +0000 UTC m=+1.837982807" lastFinishedPulling="2026-04-23 13:32:33.206729513 +0000 UTC m=+19.750473706" observedRunningTime="2026-04-23 13:32:34.281207981 +0000 UTC m=+20.824952224" watchObservedRunningTime="2026-04-23 13:32:34.281441677 +0000 UTC m=+20.825185895"
Apr 23 13:32:34.298823 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:34.298772 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7qqkv" podStartSLOduration=2.30287424 podStartE2EDuration="20.29875805s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.241653924 +0000 UTC m=+1.785398116" lastFinishedPulling="2026-04-23 13:32:33.237537727 +0000 UTC m=+19.781281926" observedRunningTime="2026-04-23 13:32:34.298315912 +0000 UTC m=+20.842060128" watchObservedRunningTime="2026-04-23 13:32:34.29875805 +0000 UTC m=+20.842502265"
Apr 23 13:32:35.057401 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.057309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:35.057584 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.057309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:35.057584 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:35.057454 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:35.057584 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:35.057548 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:35.242856 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.242702 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 13:32:35.260424 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.260389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dpm52" event={"ID":"5762a240-1436-4c52-bead-2abd75c01895","Type":"ContainerStarted","Data":"a95ef2159225e1d0f8fb704a1f08fef0ca15c5ee40e693126c9d2939fbdb7e5c"}
Apr 23 13:32:35.261648 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.261624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n7dgj" event={"ID":"de702b67-cf80-4b1f-b30b-e4a459ac038e","Type":"ContainerStarted","Data":"5a238b3142ec70ba68169928fd65774f2124b0a00d7dd948c2b9bc3e2a6f9a7c"}
Apr 23 13:32:35.263020 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.262998 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" event={"ID":"5941c699-21b4-4722-baea-e35ca0811594","Type":"ContainerStarted","Data":"8df5b28d67df256efb947a7e42c8bb231748d3f497f9d5af137d4cf06f3a5a43"}
Apr 23 13:32:35.263111 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.263026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" event={"ID":"5941c699-21b4-4722-baea-e35ca0811594","Type":"ContainerStarted","Data":"aac0b6e348083d7841fbcb8feb3ff2f82bf63fc2ae619c417ba88630fe4e9d3c"}
Apr 23 13:32:35.265150 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.264842 2576 generic.go:358] "Generic (PLEG): container finished" podID="8b934c24-9a04-47cb-a0a9-ce2109c8b735" containerID="6611831bd3217db7ebf06e4a27010a422f5dbebf1ea103c19048cfd0782c3cac" exitCode=0
Apr 23 13:32:35.265440 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.265408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhztr" event={"ID":"8b934c24-9a04-47cb-a0a9-ce2109c8b735","Type":"ContainerDied","Data":"6611831bd3217db7ebf06e4a27010a422f5dbebf1ea103c19048cfd0782c3cac"}
Apr 23 13:32:35.266735 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.266710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v6kpm" event={"ID":"e662fef7-fd2a-4a55-91de-e3ed361dab06","Type":"ContainerStarted","Data":"d4d809bc304c70a363ed32968921b90be8e7f934fddb1a84a0609c0d15d6b287"}
Apr 23 13:32:35.268476 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.268376 2576 generic.go:358] "Generic (PLEG): container finished" podID="e44658396063ef3d364eee1cf1f44e80" containerID="d8ac8ef6a47e62ba994ef1bb86f5295eaaf1b5ecb06dd7e9b9979c50c22880dd" exitCode=0
Apr 23 13:32:35.268476 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.268458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal" event={"ID":"e44658396063ef3d364eee1cf1f44e80","Type":"ContainerDied","Data":"d8ac8ef6a47e62ba994ef1bb86f5295eaaf1b5ecb06dd7e9b9979c50c22880dd"}
Apr 23 13:32:35.269723 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.269689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7jsd5" event={"ID":"e2865e96-7bac-4087-bb1b-0cf266b4deb0","Type":"ContainerStarted","Data":"6b26a2c2e225a00652273ee0c25dcee583cde00c61c1bdb7785ae36c21a16247"}
Apr 23 13:32:35.281214 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.281162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dpm52" podStartSLOduration=3.384066179 podStartE2EDuration="21.281139847s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.278351142 +0000 UTC m=+1.822095334" lastFinishedPulling="2026-04-23 13:32:33.175424806 +0000 UTC m=+19.719169002" observedRunningTime="2026-04-23 13:32:35.280861515 +0000 UTC m=+21.824605731" watchObservedRunningTime="2026-04-23 13:32:35.281139847 +0000 UTC m=+21.824884061"
Apr 23 13:32:35.310463 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.310369 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n7dgj" podStartSLOduration=3.489941241 podStartE2EDuration="21.31035281s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.354994373 +0000 UTC m=+1.898738566" lastFinishedPulling="2026-04-23 13:32:33.175405935 +0000 UTC m=+19.719150135" observedRunningTime="2026-04-23 13:32:35.295829742 +0000 UTC m=+21.839573957" watchObservedRunningTime="2026-04-23 13:32:35.31035281 +0000 UTC m=+21.854097025"
Apr 23 13:32:35.332657 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.332595 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v6kpm" podStartSLOduration=3.440871665 podStartE2EDuration="21.332576252s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.312729057 +0000 UTC m=+1.856473250" lastFinishedPulling="2026-04-23 13:32:33.204433637 +0000 UTC m=+19.748177837" observedRunningTime="2026-04-23 13:32:35.332142453 +0000 UTC m=+21.875886667" watchObservedRunningTime="2026-04-23 13:32:35.332576252 +0000 UTC m=+21.876320468"
Apr 23 13:32:35.366973 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.366930 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7jsd5" podStartSLOduration=3.5352909329999997 podStartE2EDuration="21.366915826s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.343787999 +0000 UTC m=+1.887532192" lastFinishedPulling="2026-04-23 13:32:33.175412892 +0000 UTC m=+19.719157085" observedRunningTime="2026-04-23 13:32:35.366787051 +0000 UTC m=+21.910531280" watchObservedRunningTime="2026-04-23 13:32:35.366915826 +0000 UTC m=+21.910660040"
Apr 23 13:32:35.991039 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.990907 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:32:35.242851795Z","UUID":"e929e1bd-bb24-460d-bd3d-fa258f100623","Handler":null,"Name":"","Endpoint":""}
Apr 23 13:32:35.993737 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.993710 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 13:32:35.993878 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:35.993748 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 13:32:36.277588 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:36.277529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal" event={"ID":"e44658396063ef3d364eee1cf1f44e80","Type":"ContainerStarted","Data":"f26383f48e8c3ca2034149010f3ab199ccc51cce24817cceef6ca5d200888bb8"}
Apr 23 13:32:36.281447 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:36.281417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"85ea56f8491618a9faf4f316e4eeeb577d29c55ce9df10a7f82c78f3b9177670"}
Apr 23 13:32:36.283617 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:36.283591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" event={"ID":"5941c699-21b4-4722-baea-e35ca0811594","Type":"ContainerStarted","Data":"90a463c034f1cb1ee187aab75f6c6201e554c6226e8820e929abc84117a50500"}
Apr 23 13:32:36.293689 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:36.293640 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-177.ec2.internal" podStartSLOduration=22.293624007 podStartE2EDuration="22.293624007s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:32:36.293007377 +0000 UTC m=+22.836751593" watchObservedRunningTime="2026-04-23 13:32:36.293624007 +0000 UTC m=+22.837368223"
Apr 23 13:32:36.312333 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:36.312276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m7xq9" podStartSLOduration=1.55537982 podStartE2EDuration="22.31225993s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.350453395 +0000 UTC m=+1.894197601" lastFinishedPulling="2026-04-23 13:32:36.107333511 +0000 UTC m=+22.651077711" observedRunningTime="2026-04-23 13:32:36.311578777 +0000 UTC m=+22.855322993" watchObservedRunningTime="2026-04-23 13:32:36.31225993 +0000 UTC m=+22.856004179"
Apr 23 13:32:37.056669 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:37.056623 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:37.056885 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:37.056639 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:37.056885 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:37.056755 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:37.056885 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:37.056825 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:32:39.057219 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:39.057185 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:39.057824 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:39.057298 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613"
Apr 23 13:32:39.057824 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:39.057356 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:39.057824 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:39.057482 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617" Apr 23 13:32:39.293038 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:39.292583 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" event={"ID":"2b98e81e-dc6f-4d15-b8ec-77a01c0ee951","Type":"ContainerStarted","Data":"a5bca0f3762548385364a6cae333ea2cf4b046f1ef0f2fc080d0ed4162c27502"} Apr 23 13:32:39.874732 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:39.874653 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dpm52" Apr 23 13:32:39.875146 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:39.875131 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dpm52" Apr 23 13:32:40.296076 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:40.296039 2576 generic.go:358] "Generic (PLEG): container finished" podID="8b934c24-9a04-47cb-a0a9-ce2109c8b735" containerID="a2dc19fb82c44cb401c9a47ca03408843fb5c080cd9ca964cc149dddf5d40c0e" exitCode=0 Apr 23 13:32:40.296469 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:40.296118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhztr" event={"ID":"8b934c24-9a04-47cb-a0a9-ce2109c8b735","Type":"ContainerDied","Data":"a2dc19fb82c44cb401c9a47ca03408843fb5c080cd9ca964cc149dddf5d40c0e"} Apr 23 13:32:40.296723 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:40.296705 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:40.296799 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:40.296732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:40.311906 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:40.311882 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:40.359323 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:40.359274 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" podStartSLOduration=7.981541548 podStartE2EDuration="26.359257791s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.268735009 +0000 UTC m=+1.812479202" lastFinishedPulling="2026-04-23 13:32:33.646451248 +0000 UTC m=+20.190195445" observedRunningTime="2026-04-23 13:32:40.357560775 +0000 UTC m=+26.901304990" watchObservedRunningTime="2026-04-23 13:32:40.359257791 +0000 UTC m=+26.903002006" Apr 23 13:32:41.056560 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.056500 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:41.056692 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:41.056664 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613" Apr 23 13:32:41.056791 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.056763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:41.056910 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:41.056890 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617" Apr 23 13:32:41.250237 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.250028 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dpm52" Apr 23 13:32:41.250702 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.250662 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dpm52" Apr 23 13:32:41.299173 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.299143 2576 generic.go:358] "Generic (PLEG): container finished" podID="8b934c24-9a04-47cb-a0a9-ce2109c8b735" containerID="dd9e25322d8c5617dfba305c680fdb7ea7997f3e7d04271b651056bfa48d5550" exitCode=0 Apr 23 13:32:41.299715 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.299227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhztr" event={"ID":"8b934c24-9a04-47cb-a0a9-ce2109c8b735","Type":"ContainerDied","Data":"dd9e25322d8c5617dfba305c680fdb7ea7997f3e7d04271b651056bfa48d5550"} Apr 23 13:32:41.300144 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.299963 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:41.314319 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.314290 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj" Apr 23 13:32:41.482163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.482120 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fnd8j"] Apr 23 13:32:41.482306 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.482252 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:41.482357 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:41.482338 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613" Apr 23 13:32:41.482988 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.482959 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wzp5m"] Apr 23 13:32:41.483108 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:41.483058 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:41.483163 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:41.483138 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617" Apr 23 13:32:42.303057 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:42.303024 2576 generic.go:358] "Generic (PLEG): container finished" podID="8b934c24-9a04-47cb-a0a9-ce2109c8b735" containerID="f7c843e8d007a2e391194a95001ab96adbcf1a3d938998ec4c315bf4734488f7" exitCode=0 Apr 23 13:32:42.303565 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:42.303152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhztr" event={"ID":"8b934c24-9a04-47cb-a0a9-ce2109c8b735","Type":"ContainerDied","Data":"f7c843e8d007a2e391194a95001ab96adbcf1a3d938998ec4c315bf4734488f7"} Apr 23 13:32:43.056780 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:43.056739 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:43.056923 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:43.056740 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:43.056923 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:43.056873 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617" Apr 23 13:32:43.057038 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:43.056925 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613" Apr 23 13:32:45.056993 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.056957 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j" Apr 23 13:32:45.057612 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.056972 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:45.057612 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:45.057083 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnd8j" podUID="73b441a9-2b94-42af-ba5d-7d626ce72613" Apr 23 13:32:45.057612 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:45.057196 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617" Apr 23 13:32:45.768328 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.768300 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-177.ec2.internal" event="NodeReady" Apr 23 13:32:45.768496 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.768441 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:32:45.816562 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.816392 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6zrlv"] Apr 23 13:32:45.844171 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.844143 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5s4f6"] Apr 23 13:32:45.844315 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.844302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:45.847640 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.847607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:32:45.847901 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.847883 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:32:45.848142 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.848124 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svh2p\"" Apr 23 13:32:45.860437 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.860403 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6zrlv"] Apr 23 13:32:45.860437 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.860430 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5s4f6"] 
Apr 23 13:32:45.860642 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.860556 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:32:45.863418 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.863398 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:32:45.863543 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.863450 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:32:45.863543 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.863539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cqcw8\"" Apr 23 13:32:45.863643 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.863579 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:32:45.968421 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.968388 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpndb\" (UniqueName: \"kubernetes.io/projected/d6361136-0129-4e11-9891-a7117fbe5be5-kube-api-access-bpndb\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:45.968605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.968445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8ght\" (UniqueName: \"kubernetes.io/projected/24201bdd-9893-495a-8f70-680500f3a31d-kube-api-access-g8ght\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:32:45.968605 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:32:45.968475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:45.968605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.968531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:32:45.968605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.968560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6361136-0129-4e11-9891-a7117fbe5be5-config-volume\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:45.968605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:45.968602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d6361136-0129-4e11-9891-a7117fbe5be5-tmp-dir\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.069725 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.069686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:32:46.070161 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:32:46.069736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6361136-0129-4e11-9891-a7117fbe5be5-config-volume\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.070161 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.069774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d6361136-0129-4e11-9891-a7117fbe5be5-tmp-dir\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.070161 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.069813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpndb\" (UniqueName: \"kubernetes.io/projected/d6361136-0129-4e11-9891-a7117fbe5be5-kube-api-access-bpndb\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.070161 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.069847 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:46.070161 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.069932 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert podName:24201bdd-9893-495a-8f70-680500f3a31d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:46.569911518 +0000 UTC m=+33.113655719 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert") pod "ingress-canary-5s4f6" (UID: "24201bdd-9893-495a-8f70-680500f3a31d") : secret "canary-serving-cert" not found Apr 23 13:32:46.070426 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.070230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8ght\" (UniqueName: \"kubernetes.io/projected/24201bdd-9893-495a-8f70-680500f3a31d-kube-api-access-g8ght\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:32:46.070426 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.070262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.070426 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.070287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d6361136-0129-4e11-9891-a7117fbe5be5-tmp-dir\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.070426 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.070386 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:46.070634 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.070434 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls podName:d6361136-0129-4e11-9891-a7117fbe5be5 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:46.570418881 +0000 UTC m=+33.114163077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls") pod "dns-default-6zrlv" (UID: "d6361136-0129-4e11-9891-a7117fbe5be5") : secret "dns-default-metrics-tls" not found Apr 23 13:32:46.070634 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.070435 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d6361136-0129-4e11-9891-a7117fbe5be5-config-volume\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.083192 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.083160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8ght\" (UniqueName: \"kubernetes.io/projected/24201bdd-9893-495a-8f70-680500f3a31d-kube-api-access-g8ght\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:32:46.094552 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.094524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpndb\" (UniqueName: \"kubernetes.io/projected/d6361136-0129-4e11-9891-a7117fbe5be5-kube-api-access-bpndb\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.574362 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.574317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:32:46.574597 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:32:46.574425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv" Apr 23 13:32:46.574597 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.574476 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:46.574597 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.574524 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:46.574597 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.574560 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert podName:24201bdd-9893-495a-8f70-680500f3a31d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:47.574541236 +0000 UTC m=+34.118285444 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert") pod "ingress-canary-5s4f6" (UID: "24201bdd-9893-495a-8f70-680500f3a31d") : secret "canary-serving-cert" not found Apr 23 13:32:46.574597 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.574577 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls podName:d6361136-0129-4e11-9891-a7117fbe5be5 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:47.574569994 +0000 UTC m=+34.118314187 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls") pod "dns-default-6zrlv" (UID: "d6361136-0129-4e11-9891-a7117fbe5be5") : secret "dns-default-metrics-tls" not found Apr 23 13:32:46.675361 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.675317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:32:46.675552 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.675471 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:32:46.675623 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.675568 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:18.675552727 +0000 UTC m=+65.219296925 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:32:46.776156 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:46.776121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:46.776335 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.776283 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:32:46.776335 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.776307 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:32:46.776335 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.776318 2576 projected.go:194] Error preparing data for projected volume kube-api-access-xcs7b for pod openshift-network-diagnostics/network-check-target-fnd8j: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:46.776447 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:46.776390 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b podName:73b441a9-2b94-42af-ba5d-7d626ce72613 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:18.77637624 +0000 UTC m=+65.320120449 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xcs7b" (UniqueName: "kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b") pod "network-check-target-fnd8j" (UID: "73b441a9-2b94-42af-ba5d-7d626ce72613") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:32:47.056522 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.056462 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:32:47.056734 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.056467 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:32:47.061972 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.061944 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:32:47.062131 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.062006 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj9pj\""
Apr 23 13:32:47.062131 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.062035 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jcmp5\""
Apr 23 13:32:47.062131 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.061947 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:32:47.062598 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.062577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:32:47.582390 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.582348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:32:47.582879 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:47.582419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6"
Apr 23 13:32:47.582879 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:47.582523 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:47.582879 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:47.582547 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:47.582879 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:47.582600 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls podName:d6361136-0129-4e11-9891-a7117fbe5be5 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.582577691 +0000 UTC m=+36.126321887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls") pod "dns-default-6zrlv" (UID: "d6361136-0129-4e11-9891-a7117fbe5be5") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:47.582879 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:47.582624 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert podName:24201bdd-9893-495a-8f70-680500f3a31d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:49.582612938 +0000 UTC m=+36.126357140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert") pod "ingress-canary-5s4f6" (UID: "24201bdd-9893-495a-8f70-680500f3a31d") : secret "canary-serving-cert" not found
Apr 23 13:32:49.318521 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:49.318481 2576 generic.go:358] "Generic (PLEG): container finished" podID="8b934c24-9a04-47cb-a0a9-ce2109c8b735" containerID="8257e5e3dd979204b34d1f6eaf24d4d0771a7c5d4d201d822f75c6e961fbdf1c" exitCode=0
Apr 23 13:32:49.318913 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:49.318540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhztr" event={"ID":"8b934c24-9a04-47cb-a0a9-ce2109c8b735","Type":"ContainerDied","Data":"8257e5e3dd979204b34d1f6eaf24d4d0771a7c5d4d201d822f75c6e961fbdf1c"}
Apr 23 13:32:49.599241 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:49.599151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:32:49.599241 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:49.599201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6"
Apr 23 13:32:49.599533 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:49.599300 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:49.599533 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:49.599310 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:49.599533 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:49.599354 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert podName:24201bdd-9893-495a-8f70-680500f3a31d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:53.599339846 +0000 UTC m=+40.143084040 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert") pod "ingress-canary-5s4f6" (UID: "24201bdd-9893-495a-8f70-680500f3a31d") : secret "canary-serving-cert" not found
Apr 23 13:32:49.599533 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:49.599369 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls podName:d6361136-0129-4e11-9891-a7117fbe5be5 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:53.599362028 +0000 UTC m=+40.143106221 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls") pod "dns-default-6zrlv" (UID: "d6361136-0129-4e11-9891-a7117fbe5be5") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:50.323067 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:50.323037 2576 generic.go:358] "Generic (PLEG): container finished" podID="8b934c24-9a04-47cb-a0a9-ce2109c8b735" containerID="f27dfd03a9935956a699cb9853639648c98379c1a4fd8767868b7577740c81fd" exitCode=0
Apr 23 13:32:50.323467 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:50.323077 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhztr" event={"ID":"8b934c24-9a04-47cb-a0a9-ce2109c8b735","Type":"ContainerDied","Data":"f27dfd03a9935956a699cb9853639648c98379c1a4fd8767868b7577740c81fd"}
Apr 23 13:32:51.327696 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:51.327657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hhztr" event={"ID":"8b934c24-9a04-47cb-a0a9-ce2109c8b735","Type":"ContainerStarted","Data":"c3b30b6af692f3e483436588594f6f8d9196ea438c32f622499d0207f784929e"}
Apr 23 13:32:51.351229 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:51.351184 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hhztr" podStartSLOduration=4.47601406 podStartE2EDuration="37.351168871s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:32:15.332387747 +0000 UTC m=+1.876131940" lastFinishedPulling="2026-04-23 13:32:48.207542558 +0000 UTC m=+34.751286751" observedRunningTime="2026-04-23 13:32:51.349430618 +0000 UTC m=+37.893174824" watchObservedRunningTime="2026-04-23 13:32:51.351168871 +0000 UTC m=+37.894913085"
Apr 23 13:32:53.625965 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:53.625927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:32:53.625965 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:32:53.625977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6"
Apr 23 13:32:53.626440 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:53.626068 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:53.626440 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:53.626075 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:53.626440 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:53.626123 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert podName:24201bdd-9893-495a-8f70-680500f3a31d nodeName:}" failed. No retries permitted until 2026-04-23 13:33:01.626106615 +0000 UTC m=+48.169850808 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert") pod "ingress-canary-5s4f6" (UID: "24201bdd-9893-495a-8f70-680500f3a31d") : secret "canary-serving-cert" not found
Apr 23 13:32:53.626440 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:32:53.626135 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls podName:d6361136-0129-4e11-9891-a7117fbe5be5 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:01.626130118 +0000 UTC m=+48.169874311 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls") pod "dns-default-6zrlv" (UID: "d6361136-0129-4e11-9891-a7117fbe5be5") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:01.679987 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:01.679941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:33:01.680495 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:01.680004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6"
Apr 23 13:33:01.680495 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:01.680094 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:01.680495 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:01.680113 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:01.680495 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:01.680160 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls podName:d6361136-0129-4e11-9891-a7117fbe5be5 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:17.680143691 +0000 UTC m=+64.223887885 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls") pod "dns-default-6zrlv" (UID: "d6361136-0129-4e11-9891-a7117fbe5be5") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:01.680495 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:01.680175 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert podName:24201bdd-9893-495a-8f70-680500f3a31d nodeName:}" failed. No retries permitted until 2026-04-23 13:33:17.680169066 +0000 UTC m=+64.223913259 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert") pod "ingress-canary-5s4f6" (UID: "24201bdd-9893-495a-8f70-680500f3a31d") : secret "canary-serving-cert" not found
Apr 23 13:33:13.315078 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:13.315050 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lhgvj"
Apr 23 13:33:17.684266 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:17.684211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6"
Apr 23 13:33:17.684758 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:17.684367 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:17.684758 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:17.684400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:33:17.684758 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:17.684433 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert podName:24201bdd-9893-495a-8f70-680500f3a31d nodeName:}" failed. No retries permitted until 2026-04-23 13:33:49.684418312 +0000 UTC m=+96.228162505 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert") pod "ingress-canary-5s4f6" (UID: "24201bdd-9893-495a-8f70-680500f3a31d") : secret "canary-serving-cert" not found
Apr 23 13:33:17.684758 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:17.684555 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:17.684758 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:17.684627 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls podName:d6361136-0129-4e11-9891-a7117fbe5be5 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:49.684611127 +0000 UTC m=+96.228355340 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls") pod "dns-default-6zrlv" (UID: "d6361136-0129-4e11-9891-a7117fbe5be5") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:18.691883 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:18.691839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:33:18.695140 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:18.695121 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:33:18.702291 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:18.702268 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:33:18.702352 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:18.702335 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:22.702318391 +0000 UTC m=+129.246062584 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : secret "metrics-daemon-secret" not found
Apr 23 13:33:18.792711 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:18.792674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:33:18.795694 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:18.795674 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:33:18.805498 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:18.805469 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:33:18.817025 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:18.816986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcs7b\" (UniqueName: \"kubernetes.io/projected/73b441a9-2b94-42af-ba5d-7d626ce72613-kube-api-access-xcs7b\") pod \"network-check-target-fnd8j\" (UID: \"73b441a9-2b94-42af-ba5d-7d626ce72613\") " pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:33:18.872987 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:18.872950 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jcmp5\""
Apr 23 13:33:18.880303 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:18.880278 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:33:19.052096 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:19.052048 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fnd8j"]
Apr 23 13:33:19.056104 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:33:19.056073 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b441a9_2b94_42af_ba5d_7d626ce72613.slice/crio-14756ac9889b3b4837749a33d558b24b03527b76f9a50f4a6acc5fc73a0976dd WatchSource:0}: Error finding container 14756ac9889b3b4837749a33d558b24b03527b76f9a50f4a6acc5fc73a0976dd: Status 404 returned error can't find the container with id 14756ac9889b3b4837749a33d558b24b03527b76f9a50f4a6acc5fc73a0976dd
Apr 23 13:33:19.376074 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:19.376034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fnd8j" event={"ID":"73b441a9-2b94-42af-ba5d-7d626ce72613","Type":"ContainerStarted","Data":"14756ac9889b3b4837749a33d558b24b03527b76f9a50f4a6acc5fc73a0976dd"}
Apr 23 13:33:22.383308 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:22.383270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fnd8j" event={"ID":"73b441a9-2b94-42af-ba5d-7d626ce72613","Type":"ContainerStarted","Data":"8b03215e61a61b21aa25abaa18dd4f8c3c3d8baefd166e349bbd1bf9a3ca75ca"}
Apr 23 13:33:22.383725 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:22.383498 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:33:22.400580 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:22.400535 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fnd8j" podStartSLOduration=65.690269318 podStartE2EDuration="1m8.400500341s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:33:19.057973482 +0000 UTC m=+65.601717675" lastFinishedPulling="2026-04-23 13:33:21.7682045 +0000 UTC m=+68.311948698" observedRunningTime="2026-04-23 13:33:22.399619996 +0000 UTC m=+68.943364223" watchObservedRunningTime="2026-04-23 13:33:22.400500341 +0000 UTC m=+68.944244552"
Apr 23 13:33:49.702425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:49.702374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6"
Apr 23 13:33:49.702425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:49.702437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:33:49.702965 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:49.702529 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:33:49.702965 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:49.702531 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:33:49.702965 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:49.702582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls podName:d6361136-0129-4e11-9891-a7117fbe5be5 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:53.702568395 +0000 UTC m=+160.246312588 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls") pod "dns-default-6zrlv" (UID: "d6361136-0129-4e11-9891-a7117fbe5be5") : secret "dns-default-metrics-tls" not found
Apr 23 13:33:49.702965 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:33:49.702595 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert podName:24201bdd-9893-495a-8f70-680500f3a31d nodeName:}" failed. No retries permitted until 2026-04-23 13:34:53.702589575 +0000 UTC m=+160.246333768 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert") pod "ingress-canary-5s4f6" (UID: "24201bdd-9893-495a-8f70-680500f3a31d") : secret "canary-serving-cert" not found
Apr 23 13:33:53.387980 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:33:53.387948 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fnd8j"
Apr 23 13:34:06.856818 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.856787 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"]
Apr 23 13:34:06.859546 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.859530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"
Apr 23 13:34:06.863263 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.862959 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 13:34:06.863263 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.863248 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:34:06.863445 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.863312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-v5w48\""
Apr 23 13:34:06.864030 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.864010 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 13:34:06.864124 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.864057 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 13:34:06.876455 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.876432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"]
Apr 23 13:34:06.915597 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.915563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"
Apr 23 13:34:06.915769 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.915629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kddxx\" (UniqueName: \"kubernetes.io/projected/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-kube-api-access-kddxx\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"
Apr 23 13:34:06.915769 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.915662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"
Apr 23 13:34:06.961703 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.961666 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-s48s8"]
Apr 23 13:34:06.964678 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.964661 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"]
Apr 23 13:34:06.964827 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.964807 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8"
Apr 23 13:34:06.967493 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.967472 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6f688cbc9d-w6cb7"]
Apr 23 13:34:06.967670 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.967649 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:06.970069 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.970052 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:06.970457 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.970438 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-8n42z\""
Apr 23 13:34:06.971602 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.971586 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 23 13:34:06.972314 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.972289 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 13:34:06.972457 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.972317 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-m7ct6\""
Apr 23 13:34:06.973010 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.972993 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 23 13:34:06.977362 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.977340 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:34:06.977444 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.977420 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 13:34:06.977806 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.977787 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 23 13:34:06.977889 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.977805 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 13:34:06.978032 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.978010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 23 13:34:06.978783 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.978757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-s48s8"]
Apr 23 13:34:06.981159 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.981142 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 23 13:34:06.981836 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.981821 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 23 13:34:06.981929 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.981914 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 13:34:06.982104 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.982088 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 13:34:06.982175 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.982134 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-fhs2q\""
Apr 23 13:34:06.982223 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.982192 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 23 13:34:06.982378 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.982362 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 13:34:06.982636 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.982614 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 23 13:34:06.984099 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.984080 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"]
Apr 23 13:34:06.986726 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:06.986707 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6f688cbc9d-w6cb7"]
Apr 23 13:34:07.016008 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.015980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"
Apr 23 13:34:07.016109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9aeb729e-46fa-42be-8d0f-9045eabfad26-trusted-ca\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8"
Apr 23 13:34:07.016109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/becd3753-8920-40b9-bbff-58dc7e26e9b4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:07.016109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:07.016109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nzsm\" (UniqueName: \"kubernetes.io/projected/becd3753-8920-40b9-bbff-58dc7e26e9b4-kube-api-access-6nzsm\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:07.016243 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-stats-auth\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:07.016243 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016154 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeb729e-46fa-42be-8d0f-9045eabfad26-config\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8"
Apr 23 13:34:07.016243 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5gr\" (UniqueName: \"kubernetes.io/projected/a88a928d-64c1-4284-b688-d0ec2e231c16-kube-api-access-rv5gr\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:07.016243 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"
Apr 23 13:34:07.016243 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aeb729e-46fa-42be-8d0f-9045eabfad26-serving-cert\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8"
Apr 23 13:34:07.016411 ip-10-0-137-177
kubenswrapper[2576]: I0423 13:34:07.016267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-default-certificate\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.016411 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.016411 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kddxx\" (UniqueName: \"kubernetes.io/projected/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-kube-api-access-kddxx\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" Apr 23 13:34:07.016547 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn98x\" (UniqueName: \"kubernetes.io/projected/9aeb729e-46fa-42be-8d0f-9045eabfad26-kube-api-access-jn98x\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.016595 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.016754 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.016731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" Apr 23 13:34:07.018288 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.018267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" Apr 23 13:34:07.029002 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.028978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kddxx\" (UniqueName: \"kubernetes.io/projected/0b1e6cd3-17a8-4f73-b12c-4f3725a10c29-kube-api-access-kddxx\") pod \"kube-storage-version-migrator-operator-6769c5d45-r59bz\" (UID: \"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" Apr 23 13:34:07.082194 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.082159 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tpxdm"] Apr 23 13:34:07.085136 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:34:07.085120 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.089874 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.089852 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 13:34:07.089981 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.089934 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:34:07.090792 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.090773 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 13:34:07.090890 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.090805 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-4fk8w\"" Apr 23 13:34:07.091302 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.091282 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:34:07.098291 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.098263 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 13:34:07.103657 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.103637 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tpxdm"] Apr 23 13:34:07.116960 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.116907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aeb729e-46fa-42be-8d0f-9045eabfad26-serving-cert\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.116960 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.116935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-default-certificate\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.116960 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.116959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.117131 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.117070 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:07.617052784 +0000 UTC m=+114.160796997 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : configmap references non-existent config key: service-ca.crt Apr 23 13:34:07.117192 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jn98x\" (UniqueName: \"kubernetes.io/projected/9aeb729e-46fa-42be-8d0f-9045eabfad26-kube-api-access-jn98x\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.117192 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.117293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9aeb729e-46fa-42be-8d0f-9045eabfad26-trusted-ca\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.117293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/becd3753-8920-40b9-bbff-58dc7e26e9b4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" Apr 23 13:34:07.117293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" Apr 23 13:34:07.117473 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.117362 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:34:07.117473 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.117450 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:07.617432268 +0000 UTC m=+114.161176474 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : secret "router-metrics-certs-default" not found Apr 23 13:34:07.117602 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nzsm\" (UniqueName: \"kubernetes.io/projected/becd3753-8920-40b9-bbff-58dc7e26e9b4-kube-api-access-6nzsm\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" Apr 23 13:34:07.117602 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-stats-auth\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.117602 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeb729e-46fa-42be-8d0f-9045eabfad26-config\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.117746 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.117602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5gr\" (UniqueName: \"kubernetes.io/projected/a88a928d-64c1-4284-b688-d0ec2e231c16-kube-api-access-rv5gr\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " 
pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.118002 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.117979 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:34:07.118089 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.118050 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls podName:becd3753-8920-40b9-bbff-58dc7e26e9b4 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:07.61803309 +0000 UTC m=+114.161777291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6fvr4" (UID: "becd3753-8920-40b9-bbff-58dc7e26e9b4") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:34:07.118459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.118434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9aeb729e-46fa-42be-8d0f-9045eabfad26-trusted-ca\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.118701 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.118682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeb729e-46fa-42be-8d0f-9045eabfad26-config\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.118821 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.118805 2576 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/becd3753-8920-40b9-bbff-58dc7e26e9b4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" Apr 23 13:34:07.119844 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.119819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aeb729e-46fa-42be-8d0f-9045eabfad26-serving-cert\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.119993 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.119973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-default-certificate\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.120147 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.120134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-stats-auth\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.130835 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.130811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nzsm\" (UniqueName: \"kubernetes.io/projected/becd3753-8920-40b9-bbff-58dc7e26e9b4-kube-api-access-6nzsm\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" Apr 23 13:34:07.131114 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.131085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5gr\" (UniqueName: \"kubernetes.io/projected/a88a928d-64c1-4284-b688-d0ec2e231c16-kube-api-access-rv5gr\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" Apr 23 13:34:07.132609 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.132590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn98x\" (UniqueName: \"kubernetes.io/projected/9aeb729e-46fa-42be-8d0f-9045eabfad26-kube-api-access-jn98x\") pod \"console-operator-9d4b6777b-s48s8\" (UID: \"9aeb729e-46fa-42be-8d0f-9045eabfad26\") " pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.170847 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.170823 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" Apr 23 13:34:07.218970 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.218939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rtzr\" (UniqueName: \"kubernetes.io/projected/60898bb3-109a-472a-a90e-9b1a908a6d36-kube-api-access-9rtzr\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.219119 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.219012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/60898bb3-109a-472a-a90e-9b1a908a6d36-snapshots\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.219119 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.219038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60898bb3-109a-472a-a90e-9b1a908a6d36-service-ca-bundle\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.219234 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.219180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60898bb3-109a-472a-a90e-9b1a908a6d36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.219288 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:34:07.219253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60898bb3-109a-472a-a90e-9b1a908a6d36-serving-cert\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.219340 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.219288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60898bb3-109a-472a-a90e-9b1a908a6d36-tmp\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.276121 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.276092 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:07.283521 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.283474 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz"] Apr 23 13:34:07.286955 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:07.286930 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b1e6cd3_17a8_4f73_b12c_4f3725a10c29.slice/crio-516af05132c740aadcc534917b8186ce6a1ea1124f799bcd88b2e76642092e41 WatchSource:0}: Error finding container 516af05132c740aadcc534917b8186ce6a1ea1124f799bcd88b2e76642092e41: Status 404 returned error can't find the container with id 516af05132c740aadcc534917b8186ce6a1ea1124f799bcd88b2e76642092e41 Apr 23 13:34:07.320207 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.319722 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/60898bb3-109a-472a-a90e-9b1a908a6d36-snapshots\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.320207 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.320007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60898bb3-109a-472a-a90e-9b1a908a6d36-service-ca-bundle\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.320207 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.320143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60898bb3-109a-472a-a90e-9b1a908a6d36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.320207 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.320185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60898bb3-109a-472a-a90e-9b1a908a6d36-serving-cert\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.320435 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.320216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60898bb3-109a-472a-a90e-9b1a908a6d36-tmp\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " 
pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.320435 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.320272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rtzr\" (UniqueName: \"kubernetes.io/projected/60898bb3-109a-472a-a90e-9b1a908a6d36-kube-api-access-9rtzr\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.322012 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.321603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/60898bb3-109a-472a-a90e-9b1a908a6d36-tmp\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.322012 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.321777 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60898bb3-109a-472a-a90e-9b1a908a6d36-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.322012 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.321873 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/60898bb3-109a-472a-a90e-9b1a908a6d36-service-ca-bundle\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm" Apr 23 13:34:07.322012 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.321969 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/60898bb3-109a-472a-a90e-9b1a908a6d36-snapshots\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm"
Apr 23 13:34:07.324615 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.324590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60898bb3-109a-472a-a90e-9b1a908a6d36-serving-cert\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm"
Apr 23 13:34:07.330899 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.330875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rtzr\" (UniqueName: \"kubernetes.io/projected/60898bb3-109a-472a-a90e-9b1a908a6d36-kube-api-access-9rtzr\") pod \"insights-operator-585dfdc468-tpxdm\" (UID: \"60898bb3-109a-472a-a90e-9b1a908a6d36\") " pod="openshift-insights/insights-operator-585dfdc468-tpxdm"
Apr 23 13:34:07.393846 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.393761 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-s48s8"]
Apr 23 13:34:07.393846 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.393798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-tpxdm"
Apr 23 13:34:07.398453 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:07.398426 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aeb729e_46fa_42be_8d0f_9045eabfad26.slice/crio-a8285ff06810b70f441049ebab613db2adf546ff4c5ae4c81de659147e9cbbe1 WatchSource:0}: Error finding container a8285ff06810b70f441049ebab613db2adf546ff4c5ae4c81de659147e9cbbe1: Status 404 returned error can't find the container with id a8285ff06810b70f441049ebab613db2adf546ff4c5ae4c81de659147e9cbbe1
Apr 23 13:34:07.471402 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.471369 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" event={"ID":"9aeb729e-46fa-42be-8d0f-9045eabfad26","Type":"ContainerStarted","Data":"a8285ff06810b70f441049ebab613db2adf546ff4c5ae4c81de659147e9cbbe1"}
Apr 23 13:34:07.472365 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.472339 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" event={"ID":"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29","Type":"ContainerStarted","Data":"516af05132c740aadcc534917b8186ce6a1ea1124f799bcd88b2e76642092e41"}
Apr 23 13:34:07.508460 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.508427 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-tpxdm"]
Apr 23 13:34:07.512697 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:07.512673 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60898bb3_109a_472a_a90e_9b1a908a6d36.slice/crio-22fda266f767b728516ac1a7b61ca3b4113776c5b69d678b0631cf138a2de60d WatchSource:0}: Error finding container 22fda266f767b728516ac1a7b61ca3b4113776c5b69d678b0631cf138a2de60d: Status 404 returned error can't find the container with id 22fda266f767b728516ac1a7b61ca3b4113776c5b69d678b0631cf138a2de60d
Apr 23 13:34:07.622284 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.622248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:07.622284 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.622290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:07.622461 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.622388 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:07.622461 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.622394 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:34:07.622461 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:07.622418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:07.622461 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.622441 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls podName:becd3753-8920-40b9-bbff-58dc7e26e9b4 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:08.622427481 +0000 UTC m=+115.166171679 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6fvr4" (UID: "becd3753-8920-40b9-bbff-58dc7e26e9b4") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:07.622663 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.622494 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:08.622478218 +0000 UTC m=+115.166222419 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : secret "router-metrics-certs-default" not found
Apr 23 13:34:07.622663 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:07.622541 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:08.622531076 +0000 UTC m=+115.166275274 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : configmap references non-existent config key: service-ca.crt
Apr 23 13:34:08.475890 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:08.475811 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tpxdm" event={"ID":"60898bb3-109a-472a-a90e-9b1a908a6d36","Type":"ContainerStarted","Data":"22fda266f767b728516ac1a7b61ca3b4113776c5b69d678b0631cf138a2de60d"}
Apr 23 13:34:08.632312 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:08.632264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:08.632473 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:08.632372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:08.632473 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:08.632417 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:08.632473 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:08.632449 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:10.63242559 +0000 UTC m=+117.176169788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : configmap references non-existent config key: service-ca.crt
Apr 23 13:34:08.632645 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:08.632538 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:08.632645 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:08.632592 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls podName:becd3753-8920-40b9-bbff-58dc7e26e9b4 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:10.632575646 +0000 UTC m=+117.176319845 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6fvr4" (UID: "becd3753-8920-40b9-bbff-58dc7e26e9b4") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:08.632755 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:08.632652 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:34:08.632755 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:08.632685 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:10.632674209 +0000 UTC m=+117.176418404 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : secret "router-metrics-certs-default" not found
Apr 23 13:34:10.077847 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.077818 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb"]
Apr 23 13:34:10.080534 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.080516 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb"
Apr 23 13:34:10.083134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.083112 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-z59m5\""
Apr 23 13:34:10.090836 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.090814 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb"]
Apr 23 13:34:10.146398 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.146375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t72xz\" (UniqueName: \"kubernetes.io/projected/03ff5b18-36df-4e34-93ea-5d57a4bf949b-kube-api-access-t72xz\") pod \"network-check-source-8894fc9bd-thrhb\" (UID: \"03ff5b18-36df-4e34-93ea-5d57a4bf949b\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb"
Apr 23 13:34:10.247734 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.247698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t72xz\" (UniqueName: \"kubernetes.io/projected/03ff5b18-36df-4e34-93ea-5d57a4bf949b-kube-api-access-t72xz\") pod \"network-check-source-8894fc9bd-thrhb\" (UID: \"03ff5b18-36df-4e34-93ea-5d57a4bf949b\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb"
Apr 23 13:34:10.258962 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.258930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t72xz\" (UniqueName: \"kubernetes.io/projected/03ff5b18-36df-4e34-93ea-5d57a4bf949b-kube-api-access-t72xz\") pod \"network-check-source-8894fc9bd-thrhb\" (UID: \"03ff5b18-36df-4e34-93ea-5d57a4bf949b\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb"
Apr 23 13:34:10.389022 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.388928 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb"
Apr 23 13:34:10.482753 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.482726 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/0.log"
Apr 23 13:34:10.482915 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.482775 2576 generic.go:358] "Generic (PLEG): container finished" podID="9aeb729e-46fa-42be-8d0f-9045eabfad26" containerID="2bd65c12799f2e57bfadce46147a2b7e381b9b1fa4c29fa3108bab20c49db9cd" exitCode=255
Apr 23 13:34:10.482915 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.482828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" event={"ID":"9aeb729e-46fa-42be-8d0f-9045eabfad26","Type":"ContainerDied","Data":"2bd65c12799f2e57bfadce46147a2b7e381b9b1fa4c29fa3108bab20c49db9cd"}
Apr 23 13:34:10.483160 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.483135 2576 scope.go:117] "RemoveContainer" containerID="2bd65c12799f2e57bfadce46147a2b7e381b9b1fa4c29fa3108bab20c49db9cd"
Apr 23 13:34:10.484332 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.484296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" event={"ID":"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29","Type":"ContainerStarted","Data":"e1237ba7d2c6aac7c1836477a77db09fb9a2809e92260d1d1f10f73e162dffcd"}
Apr 23 13:34:10.486004 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.485981 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tpxdm" event={"ID":"60898bb3-109a-472a-a90e-9b1a908a6d36","Type":"ContainerStarted","Data":"f7edd248bd0fab4d81e8bb70608f4dec487b9ff81aaac33af2eb46bc694b3123"}
Apr 23 13:34:10.512278 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.512252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb"]
Apr 23 13:34:10.515674 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.515617 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-tpxdm" podStartSLOduration=0.912321601 podStartE2EDuration="3.51560085s" podCreationTimestamp="2026-04-23 13:34:07 +0000 UTC" firstStartedPulling="2026-04-23 13:34:07.514408688 +0000 UTC m=+114.058152881" lastFinishedPulling="2026-04-23 13:34:10.117687938 +0000 UTC m=+116.661432130" observedRunningTime="2026-04-23 13:34:10.514972298 +0000 UTC m=+117.058716516" watchObservedRunningTime="2026-04-23 13:34:10.51560085 +0000 UTC m=+117.059345065"
Apr 23 13:34:10.516654 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:10.516629 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ff5b18_36df_4e34_93ea_5d57a4bf949b.slice/crio-ec7358f6ebd35ffb2304ad0e8e812d6040d5f85d64884c76f5900bdaea34ff81 WatchSource:0}: Error finding container ec7358f6ebd35ffb2304ad0e8e812d6040d5f85d64884c76f5900bdaea34ff81: Status 404 returned error can't find the container with id ec7358f6ebd35ffb2304ad0e8e812d6040d5f85d64884c76f5900bdaea34ff81
Apr 23 13:34:10.540360 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.539396 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" podStartSLOduration=1.716430476 podStartE2EDuration="4.539378011s" podCreationTimestamp="2026-04-23 13:34:06 +0000 UTC" firstStartedPulling="2026-04-23 13:34:07.288699076 +0000 UTC m=+113.832443276" lastFinishedPulling="2026-04-23 13:34:10.111646607 +0000 UTC m=+116.655390811" observedRunningTime="2026-04-23 13:34:10.539044807 +0000 UTC m=+117.082789023" watchObservedRunningTime="2026-04-23 13:34:10.539378011 +0000 UTC m=+117.083122227"
Apr 23 13:34:10.651613 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.651526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:10.651613 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.651580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:10.651844 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:10.651629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:10.651844 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:10.651634 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:34:10.651844 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:10.651681 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:10.651844 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:10.651716 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:14.651693501 +0000 UTC m=+121.195437717 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : secret "router-metrics-certs-default" not found
Apr 23 13:34:10.651844 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:10.651735 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls podName:becd3753-8920-40b9-bbff-58dc7e26e9b4 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:14.651727006 +0000 UTC m=+121.195471198 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6fvr4" (UID: "becd3753-8920-40b9-bbff-58dc7e26e9b4") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:10.651844 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:10.651751 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:14.651743794 +0000 UTC m=+121.195487987 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : configmap references non-existent config key: service-ca.crt
Apr 23 13:34:11.489964 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.489928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb" event={"ID":"03ff5b18-36df-4e34-93ea-5d57a4bf949b","Type":"ContainerStarted","Data":"b3a8295acd8914122a1a02a0a30123408c3d3c800512df9839d6d61fda0c070e"}
Apr 23 13:34:11.489964 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.489967 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb" event={"ID":"03ff5b18-36df-4e34-93ea-5d57a4bf949b","Type":"ContainerStarted","Data":"ec7358f6ebd35ffb2304ad0e8e812d6040d5f85d64884c76f5900bdaea34ff81"}
Apr 23 13:34:11.491352 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.491331 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/1.log"
Apr 23 13:34:11.491745 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.491726 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/0.log"
Apr 23 13:34:11.491832 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.491768 2576 generic.go:358] "Generic (PLEG): container finished" podID="9aeb729e-46fa-42be-8d0f-9045eabfad26" containerID="87f13f4e809c6f670684e293a526a9f84be8d8eaf62e121496f83f8bfd99ff84" exitCode=255
Apr 23 13:34:11.491886 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.491832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" event={"ID":"9aeb729e-46fa-42be-8d0f-9045eabfad26","Type":"ContainerDied","Data":"87f13f4e809c6f670684e293a526a9f84be8d8eaf62e121496f83f8bfd99ff84"}
Apr 23 13:34:11.491886 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.491865 2576 scope.go:117] "RemoveContainer" containerID="2bd65c12799f2e57bfadce46147a2b7e381b9b1fa4c29fa3108bab20c49db9cd"
Apr 23 13:34:11.492078 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.492046 2576 scope.go:117] "RemoveContainer" containerID="87f13f4e809c6f670684e293a526a9f84be8d8eaf62e121496f83f8bfd99ff84"
Apr 23 13:34:11.492246 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:11.492223 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-s48s8_openshift-console-operator(9aeb729e-46fa-42be-8d0f-9045eabfad26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" podUID="9aeb729e-46fa-42be-8d0f-9045eabfad26"
Apr 23 13:34:11.505316 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:11.505271 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-thrhb" podStartSLOduration=1.505254503 podStartE2EDuration="1.505254503s" podCreationTimestamp="2026-04-23 13:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:11.504357908 +0000 UTC m=+118.048102126" watchObservedRunningTime="2026-04-23 13:34:11.505254503 +0000 UTC m=+118.048998720"
Apr 23 13:34:12.494948 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:12.494918 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/1.log"
Apr 23 13:34:12.495449 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:12.495364 2576 scope.go:117] "RemoveContainer" containerID="87f13f4e809c6f670684e293a526a9f84be8d8eaf62e121496f83f8bfd99ff84"
Apr 23 13:34:12.495554 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:12.495531 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-s48s8_openshift-console-operator(9aeb729e-46fa-42be-8d0f-9045eabfad26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" podUID="9aeb729e-46fa-42be-8d0f-9045eabfad26"
Apr 23 13:34:13.201896 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:13.201865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v6kpm_e662fef7-fd2a-4a55-91de-e3ed361dab06/dns-node-resolver/0.log"
Apr 23 13:34:14.402037 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:14.402012 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n7dgj_de702b67-cf80-4b1f-b30b-e4a459ac038e/node-ca/0.log"
Apr 23 13:34:14.681295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:14.681200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:14.681473 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:14.681310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:14.681473 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:14.681340 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:14.681473 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:14.681380 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:14.681473 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:14.681409 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls podName:becd3753-8920-40b9-bbff-58dc7e26e9b4 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:22.681393119 +0000 UTC m=+129.225137312 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6fvr4" (UID: "becd3753-8920-40b9-bbff-58dc7e26e9b4") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:14.681686 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:14.681473 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:34:14.681686 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:14.681476 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:22.681460255 +0000 UTC m=+129.225204460 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : configmap references non-existent config key: service-ca.crt
Apr 23 13:34:14.681686 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:14.681553 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs podName:a88a928d-64c1-4284-b688-d0ec2e231c16 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:22.681537205 +0000 UTC m=+129.225281401 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs") pod "router-default-6f688cbc9d-w6cb7" (UID: "a88a928d-64c1-4284-b688-d0ec2e231c16") : secret "router-metrics-certs-default" not found
Apr 23 13:34:17.276267 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:17.276218 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8"
Apr 23 13:34:17.276789 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:17.276285 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8"
Apr 23 13:34:17.276855 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:17.276791 2576 scope.go:117] "RemoveContainer" containerID="87f13f4e809c6f670684e293a526a9f84be8d8eaf62e121496f83f8bfd99ff84"
Apr 23 13:34:17.277026 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:17.277004 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-s48s8_openshift-console-operator(9aeb729e-46fa-42be-8d0f-9045eabfad26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" podUID="9aeb729e-46fa-42be-8d0f-9045eabfad26"
Apr 23 13:34:22.751870 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:22.751823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:22.752362 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:22.751880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:34:22.752362 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:22.751910 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:22.752362 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:22.751934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"
Apr 23 13:34:22.752362 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:22.752027 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:22.752362 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:22.752036 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:34:22.752362 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:22.752092 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs podName:fde80200-8a4e-4844-91f0-ed8f18a92617 nodeName:}" failed. No retries permitted until 2026-04-23 13:36:24.752077437 +0000 UTC m=+251.295821630 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs") pod "network-metrics-daemon-wzp5m" (UID: "fde80200-8a4e-4844-91f0-ed8f18a92617") : secret "metrics-daemon-secret" not found
Apr 23 13:34:22.752362 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:22.752106 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls podName:becd3753-8920-40b9-bbff-58dc7e26e9b4 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:38.752099625 +0000 UTC m=+145.295843819 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6fvr4" (UID: "becd3753-8920-40b9-bbff-58dc7e26e9b4") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:34:22.752642 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:22.752402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a88a928d-64c1-4284-b688-d0ec2e231c16-service-ca-bundle\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:22.754281 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:22.754260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a88a928d-64c1-4284-b688-d0ec2e231c16-metrics-certs\") pod \"router-default-6f688cbc9d-w6cb7\" (UID: \"a88a928d-64c1-4284-b688-d0ec2e231c16\") " pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:22.890977 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:22.890947 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-fhs2q\""
Apr 23 13:34:22.898681 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:22.898658 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:23.031924 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:23.031841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6f688cbc9d-w6cb7"]
Apr 23 13:34:23.035371 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:23.035332 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda88a928d_64c1_4284_b688_d0ec2e231c16.slice/crio-e26e88745a9af0d419e00bd696b8b4b3d61574b893cb338ede67661a034b8101 WatchSource:0}: Error finding container e26e88745a9af0d419e00bd696b8b4b3d61574b893cb338ede67661a034b8101: Status 404 returned error can't find the container with id e26e88745a9af0d419e00bd696b8b4b3d61574b893cb338ede67661a034b8101
Apr 23 13:34:23.521769 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:23.521730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" event={"ID":"a88a928d-64c1-4284-b688-d0ec2e231c16","Type":"ContainerStarted","Data":"9843a4f9699ba9b35db8c59dd2fbc8dd549ac4d168f09d5b559fba86960d5ac5"}
Apr 23 13:34:23.521769 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:23.521770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" event={"ID":"a88a928d-64c1-4284-b688-d0ec2e231c16","Type":"ContainerStarted","Data":"e26e88745a9af0d419e00bd696b8b4b3d61574b893cb338ede67661a034b8101"}
Apr 23 13:34:23.542575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:23.542496 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7" podStartSLOduration=17.54248286 podStartE2EDuration="17.54248286s" podCreationTimestamp="2026-04-23 13:34:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:23.541254333 +0000 UTC m=+130.084998551" watchObservedRunningTime="2026-04-23 13:34:23.54248286 +0000 UTC m=+130.086227075"
Apr 23 13:34:23.898917 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:23.898875 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:23.901564 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:23.901538 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:24.525165 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:24.525120 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:24.526455 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:24.526429 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6f688cbc9d-w6cb7"
Apr 23 13:34:29.056992 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:29.056958 2576 scope.go:117] "RemoveContainer" containerID="87f13f4e809c6f670684e293a526a9f84be8d8eaf62e121496f83f8bfd99ff84"
Apr 23 13:34:29.537909 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:29.537829 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log"
Apr 23 13:34:29.538187 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:29.538173 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/1.log"
Apr 23 13:34:29.538232 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:29.538204 2576 generic.go:358] "Generic (PLEG): container finished" podID="9aeb729e-46fa-42be-8d0f-9045eabfad26" containerID="1cc4f9b2a18e68e7e023e394b69a5e978bc02346d9f8b7d04568a8e4e30fc25c" exitCode=255
Apr 23
13:34:29.538265 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:29.538249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" event={"ID":"9aeb729e-46fa-42be-8d0f-9045eabfad26","Type":"ContainerDied","Data":"1cc4f9b2a18e68e7e023e394b69a5e978bc02346d9f8b7d04568a8e4e30fc25c"} Apr 23 13:34:29.538297 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:29.538279 2576 scope.go:117] "RemoveContainer" containerID="87f13f4e809c6f670684e293a526a9f84be8d8eaf62e121496f83f8bfd99ff84" Apr 23 13:34:29.538699 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:29.538674 2576 scope.go:117] "RemoveContainer" containerID="1cc4f9b2a18e68e7e023e394b69a5e978bc02346d9f8b7d04568a8e4e30fc25c" Apr 23 13:34:29.538894 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:29.538869 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-s48s8_openshift-console-operator(9aeb729e-46fa-42be-8d0f-9045eabfad26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" podUID="9aeb729e-46fa-42be-8d0f-9045eabfad26" Apr 23 13:34:30.541875 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:30.541849 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 13:34:35.956330 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.956283 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bkz2n"] Apr 23 13:34:35.959637 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.959619 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-77v7z"] Apr 23 13:34:35.959793 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.959775 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:35.962249 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.962229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" Apr 23 13:34:35.963710 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.963693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:34:35.964479 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.964465 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:34:35.964666 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.964648 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hkznh\"" Apr 23 13:34:35.970300 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.970276 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 13:34:35.970606 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.970585 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 13:34:35.970705 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.970594 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xh69c\"" Apr 23 13:34:35.977011 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.976985 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-77v7z"] Apr 23 13:34:35.986852 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:35.986817 2576 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bkz2n"] Apr 23 13:34:36.062372 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.062341 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-849f746c6-4s62k"] Apr 23 13:34:36.065297 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.065278 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.068655 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.068630 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:34:36.068763 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.068699 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:34:36.068847 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.068830 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7pd9g\"" Apr 23 13:34:36.068847 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.068843 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:34:36.073932 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.073912 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 13:34:36.080854 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.080830 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-849f746c6-4s62k"] Apr 23 13:34:36.156663 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.156630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-data-volume\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.156663 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.156666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.156858 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.156779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.156858 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.156836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf186b-93ff-4283-9c0b-ec05a6c706a4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-77v7z\" (UID: \"3dcf186b-93ff-4283-9c0b-ec05a6c706a4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" Apr 23 13:34:36.156858 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.156853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-crio-socket\") pod \"insights-runtime-extractor-bkz2n\" (UID: 
\"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.156957 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.156901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcl4c\" (UniqueName: \"kubernetes.io/projected/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-kube-api-access-mcl4c\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.156957 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.156932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3dcf186b-93ff-4283-9c0b-ec05a6c706a4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-77v7z\" (UID: \"3dcf186b-93ff-4283-9c0b-ec05a6c706a4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" Apr 23 13:34:36.258347 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf186b-93ff-4283-9c0b-ec05a6c706a4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-77v7z\" (UID: \"3dcf186b-93ff-4283-9c0b-ec05a6c706a4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" Apr 23 13:34:36.258347 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258295 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-crio-socket\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.258347 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:34:36.258336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-registry-tls\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.258603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcl4c\" (UniqueName: \"kubernetes.io/projected/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-kube-api-access-mcl4c\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.258603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-image-registry-private-configuration\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.258603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-bound-sa-token\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.258603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-crio-socket\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.258603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3dcf186b-93ff-4283-9c0b-ec05a6c706a4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-77v7z\" (UID: \"3dcf186b-93ff-4283-9c0b-ec05a6c706a4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" Apr 23 13:34:36.258603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-installation-pull-secrets\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.258603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258556 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-registry-certificates\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.258603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-data-volume\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " 
pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.258937 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.258937 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-ca-trust-extracted\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.258937 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc58l\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-kube-api-access-bc58l\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.258937 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.258937 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258770 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-trusted-ca\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.259106 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.258992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3dcf186b-93ff-4283-9c0b-ec05a6c706a4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-77v7z\" (UID: \"3dcf186b-93ff-4283-9c0b-ec05a6c706a4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" Apr 23 13:34:36.259436 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.259419 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-data-volume\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.259681 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.259664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.260891 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.260864 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3dcf186b-93ff-4283-9c0b-ec05a6c706a4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-77v7z\" (UID: \"3dcf186b-93ff-4283-9c0b-ec05a6c706a4\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" Apr 23 13:34:36.261015 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.260943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.267536 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.267496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcl4c\" (UniqueName: \"kubernetes.io/projected/148deae9-d0c4-4c5d-ba07-4e99ac4b8c07-kube-api-access-mcl4c\") pod \"insights-runtime-extractor-bkz2n\" (UID: \"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07\") " pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.269422 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.269404 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bkz2n" Apr 23 13:34:36.275209 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.275192 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" Apr 23 13:34:36.359293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.359259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-registry-tls\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.359427 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.359318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-image-registry-private-configuration\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.359427 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.359337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-bound-sa-token\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.359427 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.359359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-installation-pull-secrets\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.359427 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.359376 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-registry-certificates\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.359427 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.359410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-ca-trust-extracted\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.359427 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.359425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bc58l\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-kube-api-access-bc58l\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.359738 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.359447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-trusted-ca\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.360193 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.360167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-ca-trust-extracted\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " 
pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.360353 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.360333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-trusted-ca\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.360657 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.360635 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-registry-certificates\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.362203 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.362167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-installation-pull-secrets\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.362691 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.362674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-registry-tls\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.362793 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.362772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-image-registry-private-configuration\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.372020 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.371976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc58l\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-kube-api-access-bc58l\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.372020 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.371989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb-bound-sa-token\") pod \"image-registry-849f746c6-4s62k\" (UID: \"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb\") " pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.374904 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.374867 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:36.400948 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.400923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bkz2n"] Apr 23 13:34:36.402940 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:36.402914 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod148deae9_d0c4_4c5d_ba07_4e99ac4b8c07.slice/crio-600c47f9d47a704af9d7276e031fbb24e7686a7ecb73a3a979e0942e8440c6d3 WatchSource:0}: Error finding container 600c47f9d47a704af9d7276e031fbb24e7686a7ecb73a3a979e0942e8440c6d3: Status 404 returned error can't find the container with id 600c47f9d47a704af9d7276e031fbb24e7686a7ecb73a3a979e0942e8440c6d3 Apr 23 13:34:36.417946 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.417204 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-77v7z"] Apr 23 13:34:36.422929 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:36.422898 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcf186b_93ff_4283_9c0b_ec05a6c706a4.slice/crio-a1ae98fda462444c6aa78c8aa5440f06072825fd87f3d8117852a12998857b8e WatchSource:0}: Error finding container a1ae98fda462444c6aa78c8aa5440f06072825fd87f3d8117852a12998857b8e: Status 404 returned error can't find the container with id a1ae98fda462444c6aa78c8aa5440f06072825fd87f3d8117852a12998857b8e Apr 23 13:34:36.501992 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.501959 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-849f746c6-4s62k"] Apr 23 13:34:36.505490 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:36.505460 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c8b64c2_2e90_4120_aa4f_0cc5680ac7eb.slice/crio-f3a57facf952e2b4d138cdbc9537e949ce20cad8c76233e7efe0d176ce039813 WatchSource:0}: Error finding container f3a57facf952e2b4d138cdbc9537e949ce20cad8c76233e7efe0d176ce039813: Status 404 returned error can't find the container with id f3a57facf952e2b4d138cdbc9537e949ce20cad8c76233e7efe0d176ce039813 Apr 23 13:34:36.556240 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.556201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" event={"ID":"3dcf186b-93ff-4283-9c0b-ec05a6c706a4","Type":"ContainerStarted","Data":"a1ae98fda462444c6aa78c8aa5440f06072825fd87f3d8117852a12998857b8e"} Apr 23 13:34:36.557546 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.557519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bkz2n" event={"ID":"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07","Type":"ContainerStarted","Data":"1d279414d976faa22751ef5c10ba091e07723601208a423914eb889252c7a2b9"} Apr 23 13:34:36.557639 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.557556 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bkz2n" event={"ID":"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07","Type":"ContainerStarted","Data":"600c47f9d47a704af9d7276e031fbb24e7686a7ecb73a3a979e0942e8440c6d3"} Apr 23 13:34:36.558798 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:36.558771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-849f746c6-4s62k" event={"ID":"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb","Type":"ContainerStarted","Data":"f3a57facf952e2b4d138cdbc9537e949ce20cad8c76233e7efe0d176ce039813"} Apr 23 13:34:37.277228 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.277188 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:37.277659 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.277240 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:34:37.277734 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.277714 2576 scope.go:117] "RemoveContainer" containerID="1cc4f9b2a18e68e7e023e394b69a5e978bc02346d9f8b7d04568a8e4e30fc25c" Apr 23 13:34:37.277979 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:37.277953 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-s48s8_openshift-console-operator(9aeb729e-46fa-42be-8d0f-9045eabfad26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" podUID="9aeb729e-46fa-42be-8d0f-9045eabfad26" Apr 23 13:34:37.562890 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.562854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" event={"ID":"3dcf186b-93ff-4283-9c0b-ec05a6c706a4","Type":"ContainerStarted","Data":"f860d0442f0205ef37ac6f6ec6126afdcfa7c19e30a2ce14a09d6ccedd221783"} Apr 23 13:34:37.564361 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.564335 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bkz2n" event={"ID":"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07","Type":"ContainerStarted","Data":"2a41d09c4706e92249857df454869bcee7da4d22bbf3168e326c035f3bf99e21"} Apr 23 13:34:37.565514 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.565484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-849f746c6-4s62k" 
event={"ID":"9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb","Type":"ContainerStarted","Data":"3f0c0215001628d30560e61b0a9e01fcb1d35cad8e235ce34a867d09e1aedfff"} Apr 23 13:34:37.565645 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.565632 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 13:34:37.579674 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.579630 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-77v7z" podStartSLOduration=1.628800691 podStartE2EDuration="2.579617438s" podCreationTimestamp="2026-04-23 13:34:35 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.424719329 +0000 UTC m=+142.968463522" lastFinishedPulling="2026-04-23 13:34:37.375536074 +0000 UTC m=+143.919280269" observedRunningTime="2026-04-23 13:34:37.578905765 +0000 UTC m=+144.122649981" watchObservedRunningTime="2026-04-23 13:34:37.579617438 +0000 UTC m=+144.123361652" Apr 23 13:34:37.602673 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:37.602599 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-849f746c6-4s62k" podStartSLOduration=1.602585632 podStartE2EDuration="1.602585632s" podCreationTimestamp="2026-04-23 13:34:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:37.601676571 +0000 UTC m=+144.145420835" watchObservedRunningTime="2026-04-23 13:34:37.602585632 +0000 UTC m=+144.146329843" Apr 23 13:34:38.778029 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:38.777983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" Apr 23 13:34:38.780402 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:38.780321 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/becd3753-8920-40b9-bbff-58dc7e26e9b4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6fvr4\" (UID: \"becd3753-8920-40b9-bbff-58dc7e26e9b4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" Apr 23 13:34:38.783630 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:38.783606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-m7ct6\"" Apr 23 13:34:38.791863 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:38.791820 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" Apr 23 13:34:38.920362 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:38.920334 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4"] Apr 23 13:34:38.924693 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:38.924668 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbecd3753_8920_40b9_bbff_58dc7e26e9b4.slice/crio-0637bfc5bc3d0a901574bfd3d1d786e0649181aa35fe869e09504c6c46fb28fe WatchSource:0}: Error finding container 0637bfc5bc3d0a901574bfd3d1d786e0649181aa35fe869e09504c6c46fb28fe: Status 404 returned error can't find the container with id 0637bfc5bc3d0a901574bfd3d1d786e0649181aa35fe869e09504c6c46fb28fe Apr 23 13:34:39.574039 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:39.573999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bkz2n" event={"ID":"148deae9-d0c4-4c5d-ba07-4e99ac4b8c07","Type":"ContainerStarted","Data":"f7f29e3ecf224ddc0a7a1e25704c487b004c5aa0b92aa25e4eeddeb93c21410d"} Apr 23 13:34:39.575316 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:39.575284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" event={"ID":"becd3753-8920-40b9-bbff-58dc7e26e9b4","Type":"ContainerStarted","Data":"0637bfc5bc3d0a901574bfd3d1d786e0649181aa35fe869e09504c6c46fb28fe"} Apr 23 13:34:39.592785 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:39.592729 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bkz2n" podStartSLOduration=2.534315559 podStartE2EDuration="4.592712298s" podCreationTimestamp="2026-04-23 13:34:35 +0000 UTC" firstStartedPulling="2026-04-23 13:34:36.471886181 +0000 UTC m=+143.015630374" 
lastFinishedPulling="2026-04-23 13:34:38.530282908 +0000 UTC m=+145.074027113" observedRunningTime="2026-04-23 13:34:39.591342665 +0000 UTC m=+146.135086881" watchObservedRunningTime="2026-04-23 13:34:39.592712298 +0000 UTC m=+146.136456514" Apr 23 13:34:40.580267 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:40.579947 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" event={"ID":"becd3753-8920-40b9-bbff-58dc7e26e9b4","Type":"ContainerStarted","Data":"c37568567558756338ab07752a6a7f4c7db076c428fc264b832d0c8af76cb438"} Apr 23 13:34:40.597173 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:40.597112 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6fvr4" podStartSLOduration=33.04703919 podStartE2EDuration="34.597097234s" podCreationTimestamp="2026-04-23 13:34:06 +0000 UTC" firstStartedPulling="2026-04-23 13:34:38.926989739 +0000 UTC m=+145.470733937" lastFinishedPulling="2026-04-23 13:34:40.477047784 +0000 UTC m=+147.020791981" observedRunningTime="2026-04-23 13:34:40.596340703 +0000 UTC m=+147.140084931" watchObservedRunningTime="2026-04-23 13:34:40.597097234 +0000 UTC m=+147.140841483" Apr 23 13:34:47.403389 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.403352 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf"] Apr 23 13:34:47.408930 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.408914 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.411835 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.411810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 13:34:47.411974 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.411842 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 13:34:47.411974 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.411810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-227g6\"" Apr 23 13:34:47.411974 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.411821 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 13:34:47.415415 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.415038 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf"] Apr 23 13:34:47.442810 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.442784 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-s7569"] Apr 23 13:34:47.446251 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.446233 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.449011 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.448992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.449106 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.449021 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 13:34:47.449106 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.449023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.449257 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.449101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sk7r\" (UniqueName: \"kubernetes.io/projected/db1003b5-8c79-4580-8717-41e4565a67a7-kube-api-access-7sk7r\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.449257 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.449190 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 
13:34:47.449257 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.449196 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2z86s\"" Apr 23 13:34:47.449257 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.449217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db1003b5-8c79-4580-8717-41e4565a67a7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.449572 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.449557 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 13:34:47.549772 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.549744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-sys\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.549915 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.549780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-root\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.549915 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.549802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-textfile\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.549915 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.549883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.550023 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.549915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9tbb\" (UniqueName: \"kubernetes.io/projected/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-kube-api-access-x9tbb\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.550023 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.549943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.550023 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:47.549959 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 23 13:34:47.550023 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.549982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.550151 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:47.550026 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-tls podName:db1003b5-8c79-4580-8717-41e4565a67a7 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:48.05000968 +0000 UTC m=+154.593753872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-k2dcf" (UID: "db1003b5-8c79-4580-8717-41e4565a67a7") : secret "openshift-state-metrics-tls" not found Apr 23 13:34:47.550151 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.550117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sk7r\" (UniqueName: \"kubernetes.io/projected/db1003b5-8c79-4580-8717-41e4565a67a7-kube-api-access-7sk7r\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.550238 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.550148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-metrics-client-ca\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.550238 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.550178 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-tls\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.550238 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.550201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-wtmp\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.550238 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.550226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-accelerators-collector-config\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.550383 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.550310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db1003b5-8c79-4580-8717-41e4565a67a7-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.551035 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.551019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db1003b5-8c79-4580-8717-41e4565a67a7-metrics-client-ca\") pod 
\"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.552628 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.552612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.558481 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.558461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sk7r\" (UniqueName: \"kubernetes.io/projected/db1003b5-8c79-4580-8717-41e4565a67a7-kube-api-access-7sk7r\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:47.651131 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-textfile\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651282 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9tbb\" (UniqueName: \"kubernetes.io/projected/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-kube-api-access-x9tbb\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651282 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:34:47.651182 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651282 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-metrics-client-ca\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651282 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-tls\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651282 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-wtmp\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651570 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-accelerators-collector-config\") pod \"node-exporter-s7569\" (UID: 
\"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651570 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-sys\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651570 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:47.651367 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 13:34:47.651570 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-root\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651570 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:47.651437 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-tls podName:b0f2f937-55f3-482e-9e4e-bc3bfce5a791 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:48.151414238 +0000 UTC m=+154.695158432 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-tls") pod "node-exporter-s7569" (UID: "b0f2f937-55f3-482e-9e4e-bc3bfce5a791") : secret "node-exporter-tls" not found Apr 23 13:34:47.651570 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-wtmp\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651570 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-root\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651570 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-sys\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651975 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651747 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-textfile\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.651975 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.651910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-accelerators-collector-config\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.653467 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.653408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-metrics-client-ca\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.653605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.653586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:47.664497 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:47.664471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9tbb\" (UniqueName: \"kubernetes.io/projected/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-kube-api-access-x9tbb\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:48.055171 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.055143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" 
Apr 23 13:34:48.057494 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.057463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/db1003b5-8c79-4580-8717-41e4565a67a7-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-k2dcf\" (UID: \"db1003b5-8c79-4580-8717-41e4565a67a7\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:48.061037 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.061016 2576 scope.go:117] "RemoveContainer" containerID="1cc4f9b2a18e68e7e023e394b69a5e978bc02346d9f8b7d04568a8e4e30fc25c" Apr 23 13:34:48.061214 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:48.061197 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-s48s8_openshift-console-operator(9aeb729e-46fa-42be-8d0f-9045eabfad26)\"" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" podUID="9aeb729e-46fa-42be-8d0f-9045eabfad26" Apr 23 13:34:48.155771 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.155733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-tls\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:48.158064 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.158030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b0f2f937-55f3-482e-9e4e-bc3bfce5a791-node-exporter-tls\") pod \"node-exporter-s7569\" (UID: \"b0f2f937-55f3-482e-9e4e-bc3bfce5a791\") " pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:48.317863 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:34:48.317776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" Apr 23 13:34:48.354929 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.354778 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s7569" Apr 23 13:34:48.364690 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:48.364642 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0f2f937_55f3_482e_9e4e_bc3bfce5a791.slice/crio-7cc9b3de11fe8539b1167801dedc26bbaec758a8385531f1923baf634d640636 WatchSource:0}: Error finding container 7cc9b3de11fe8539b1167801dedc26bbaec758a8385531f1923baf634d640636: Status 404 returned error can't find the container with id 7cc9b3de11fe8539b1167801dedc26bbaec758a8385531f1923baf634d640636 Apr 23 13:34:48.448427 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.448391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf"] Apr 23 13:34:48.451276 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:48.451249 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb1003b5_8c79_4580_8717_41e4565a67a7.slice/crio-042885395f5268026fae30409f564ce3fa19cffdb44935a3680d0b9d99ed1eca WatchSource:0}: Error finding container 042885395f5268026fae30409f564ce3fa19cffdb44935a3680d0b9d99ed1eca: Status 404 returned error can't find the container with id 042885395f5268026fae30409f564ce3fa19cffdb44935a3680d0b9d99ed1eca Apr 23 13:34:48.501263 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.501240 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:34:48.506859 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.506842 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.511283 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.511251 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 13:34:48.511929 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.511760 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 13:34:48.511929 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.511801 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2cv4w\"" Apr 23 13:34:48.512293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.512160 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 13:34:48.512293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.512179 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 13:34:48.512293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.512193 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 13:34:48.512636 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.512480 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 13:34:48.512636 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.512533 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 13:34:48.512636 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.512569 2576 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 13:34:48.512879 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.512864 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 13:34:48.521240 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.520964 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:34:48.559908 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.559876 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxssr\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-kube-api-access-lxssr\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.559924 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560048 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.559977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560147 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560147 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560137 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-config-out\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560227 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560227 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-web-config\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560291 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-config-volume\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560291 
ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560361 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560395 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560378 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560458 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.560534 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.560520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.601740 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.601710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" event={"ID":"db1003b5-8c79-4580-8717-41e4565a67a7","Type":"ContainerStarted","Data":"ee8525f0e2947ff1320316c6e19c49868a516a82b65bdfbcaa124cc0e4502f04"} Apr 23 13:34:48.601873 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.601750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" event={"ID":"db1003b5-8c79-4580-8717-41e4565a67a7","Type":"ContainerStarted","Data":"042885395f5268026fae30409f564ce3fa19cffdb44935a3680d0b9d99ed1eca"} Apr 23 13:34:48.602803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.602775 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s7569" event={"ID":"b0f2f937-55f3-482e-9e4e-bc3bfce5a791","Type":"ContainerStarted","Data":"7cc9b3de11fe8539b1167801dedc26bbaec758a8385531f1923baf634d640636"} Apr 23 13:34:48.661177 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-config-out\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661177 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-web-config\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-config-volume\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661249 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661337 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:48.661398 2576 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxssr\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-kube-api-access-lxssr\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661433 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661980 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661980 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:48.661487 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls podName:9624dc8f-91a2-4736-9cff-18f5203ab449 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:49.161466238 +0000 UTC m=+155.705210455 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449") : secret "alertmanager-main-tls" not found Apr 23 13:34:48.661980 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.661561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.661980 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:48.661844 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle podName:9624dc8f-91a2-4736-9cff-18f5203ab449 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:34:49.161824323 +0000 UTC m=+155.705568536 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449") : configmap references non-existent config key: ca-bundle.crt Apr 23 13:34:48.662269 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.662246 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.662688 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.662662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.664806 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.664490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.664806 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.664766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-web-config\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.664806 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.664772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-config-out\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.665023 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.664855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.665023 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.664867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-config-volume\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.665175 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.665053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.665738 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.665717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.666252 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.666229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.671036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:48.671012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxssr\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-kube-api-access-lxssr\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:48.867968 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:48.867872 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6zrlv" podUID="d6361136-0129-4e11-9891-a7117fbe5be5" Apr 23 13:34:48.871034 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:48.870987 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-5s4f6" podUID="24201bdd-9893-495a-8f70-680500f3a31d" Apr 23 13:34:49.167762 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.167727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:49.167920 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.167884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:49.168997 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.168961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:49.170592 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.170568 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:49.422389 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.422318 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:49.503240 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.503211 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"] Apr 23 13:34:49.508500 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.508475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" Apr 23 13:34:49.511568 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.511396 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 23 13:34:49.511568 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.511425 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 23 13:34:49.511568 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.511435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 23 13:34:49.511796 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.511751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4j910vql563he\"" Apr 23 13:34:49.511796 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.511767 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 23 13:34:49.511899 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.511793 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-m4hvh\"" Apr 23 13:34:49.511899 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.511798 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 23 13:34:49.517522 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.517470 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"] Apr 23 13:34:49.561184 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.561143 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:34:49.564619 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:49.564588 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9624dc8f_91a2_4736_9cff_18f5203ab449.slice/crio-143ae7cb679600711225b5a41ff3b3d3e1629c58556e68c0189168627c6442de WatchSource:0}: Error finding container 143ae7cb679600711225b5a41ff3b3d3e1629c58556e68c0189168627c6442de: Status 404 returned error can't find the container with id 143ae7cb679600711225b5a41ff3b3d3e1629c58556e68c0189168627c6442de Apr 23 13:34:49.572307 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.572279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" Apr 23 13:34:49.572412 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.572336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-grpc-tls\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" Apr 23 13:34:49.572412 
ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.572386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87058acd-6bca-432b-b6aa-6f61500ac7f8-metrics-client-ca\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" Apr 23 13:34:49.572535 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.572469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-tls\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" Apr 23 13:34:49.572588 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.572558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" Apr 23 13:34:49.572635 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.572596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" Apr 23 13:34:49.572682 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.572657 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.572721 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.572684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jddx\" (UniqueName: \"kubernetes.io/projected/87058acd-6bca-432b-b6aa-6f61500ac7f8-kube-api-access-2jddx\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.606492 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.606452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerStarted","Data":"143ae7cb679600711225b5a41ff3b3d3e1629c58556e68c0189168627c6442de"}
Apr 23 13:34:49.608274 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.608241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" event={"ID":"db1003b5-8c79-4580-8717-41e4565a67a7","Type":"ContainerStarted","Data":"c76095dfd6bc6f416ad0ab62ce775f58388c44ac1173e3371f9f3e8b5adbf142"}
Apr 23 13:34:49.608425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.608275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" event={"ID":"db1003b5-8c79-4580-8717-41e4565a67a7","Type":"ContainerStarted","Data":"37eff21a06f26776242843db0baa6516d8bab7ed88a29d1db2c090ed954328b8"}
Apr 23 13:34:49.609673 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.609648 2576 generic.go:358] "Generic (PLEG): container finished" podID="b0f2f937-55f3-482e-9e4e-bc3bfce5a791" containerID="bcf47d3610f1730cd46bb1a86e96eab9147eedb7eea8f7cc3d768c978eb6b201" exitCode=0
Apr 23 13:34:49.609786 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.609737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s7569" event={"ID":"b0f2f937-55f3-482e-9e4e-bc3bfce5a791","Type":"ContainerDied","Data":"bcf47d3610f1730cd46bb1a86e96eab9147eedb7eea8f7cc3d768c978eb6b201"}
Apr 23 13:34:49.609830 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.609793 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:34:49.625458 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.625419 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-k2dcf" podStartSLOduration=1.767876472 podStartE2EDuration="2.625405074s" podCreationTimestamp="2026-04-23 13:34:47 +0000 UTC" firstStartedPulling="2026-04-23 13:34:48.604404401 +0000 UTC m=+155.148148607" lastFinishedPulling="2026-04-23 13:34:49.461933013 +0000 UTC m=+156.005677209" observedRunningTime="2026-04-23 13:34:49.625040554 +0000 UTC m=+156.168784772" watchObservedRunningTime="2026-04-23 13:34:49.625405074 +0000 UTC m=+156.169149288"
Apr 23 13:34:49.673219 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.673148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-grpc-tls\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.673345 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.673248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87058acd-6bca-432b-b6aa-6f61500ac7f8-metrics-client-ca\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.673412 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.673349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-tls\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.673412 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.673406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.673535 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.673435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.673535 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.673477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.673535 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.673521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jddx\" (UniqueName: \"kubernetes.io/projected/87058acd-6bca-432b-b6aa-6f61500ac7f8-kube-api-access-2jddx\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.673685 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.673657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.675174 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.674076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87058acd-6bca-432b-b6aa-6f61500ac7f8-metrics-client-ca\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.676834 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.676183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-tls\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.676834 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.676444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-grpc-tls\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.676834 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.676456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.677059 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.677041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.677116 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.677089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.677116 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.677099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/87058acd-6bca-432b-b6aa-6f61500ac7f8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.681782 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.681756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jddx\" (UniqueName: \"kubernetes.io/projected/87058acd-6bca-432b-b6aa-6f61500ac7f8-kube-api-access-2jddx\") pod \"thanos-querier-6ccccfc88b-ddhzw\" (UID: \"87058acd-6bca-432b-b6aa-6f61500ac7f8\") " pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.820626 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.820581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:49.944794 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:49.944769 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"]
Apr 23 13:34:49.947330 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:49.947304 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87058acd_6bca_432b_b6aa_6f61500ac7f8.slice/crio-bb466f0d9cd41ec59c35f897e5cc91e7acc1b00e811e11a3f9468d8fbcff5d68 WatchSource:0}: Error finding container bb466f0d9cd41ec59c35f897e5cc91e7acc1b00e811e11a3f9468d8fbcff5d68: Status 404 returned error can't find the container with id bb466f0d9cd41ec59c35f897e5cc91e7acc1b00e811e11a3f9468d8fbcff5d68
Apr 23 13:34:50.075163 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:34:50.075116 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wzp5m" podUID="fde80200-8a4e-4844-91f0-ed8f18a92617"
Apr 23 13:34:50.615375 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:50.615287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s7569" event={"ID":"b0f2f937-55f3-482e-9e4e-bc3bfce5a791","Type":"ContainerStarted","Data":"693f5fbe24528d0b553954c2fa23639a179c995cefc2769a1c56c4e541dd581e"}
Apr 23 13:34:50.615375 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:50.615333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s7569" event={"ID":"b0f2f937-55f3-482e-9e4e-bc3bfce5a791","Type":"ContainerStarted","Data":"efd40aab873ce52e09d4b56e125728ac094a1215335d3c9aff7df1a9dad0b798"}
Apr 23 13:34:50.616571 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:50.616539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" event={"ID":"87058acd-6bca-432b-b6aa-6f61500ac7f8","Type":"ContainerStarted","Data":"bb466f0d9cd41ec59c35f897e5cc91e7acc1b00e811e11a3f9468d8fbcff5d68"}
Apr 23 13:34:50.637053 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:50.636992 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-s7569" podStartSLOduration=2.901523401 podStartE2EDuration="3.636976816s" podCreationTimestamp="2026-04-23 13:34:47 +0000 UTC" firstStartedPulling="2026-04-23 13:34:48.36730875 +0000 UTC m=+154.911052946" lastFinishedPulling="2026-04-23 13:34:49.102762147 +0000 UTC m=+155.646506361" observedRunningTime="2026-04-23 13:34:50.635747605 +0000 UTC m=+157.179491821" watchObservedRunningTime="2026-04-23 13:34:50.636976816 +0000 UTC m=+157.180721031"
Apr 23 13:34:51.621427 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.621390 2576 generic.go:358] "Generic (PLEG): container finished" podID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerID="096e1efc242e275cf59427a54482b7677921a122ac9bea4fd1b07704ac4d64ac" exitCode=0
Apr 23 13:34:51.621927 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.621444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerDied","Data":"096e1efc242e275cf59427a54482b7677921a122ac9bea4fd1b07704ac4d64ac"}
Apr 23 13:34:51.726741 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.726705 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-587d985fcf-ptfdp"]
Apr 23 13:34:51.729713 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.729697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.732811 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.732785 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-80vqbo4a9b304\""
Apr 23 13:34:51.732964 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.732810 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-nz4bj\""
Apr 23 13:34:51.732964 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.732810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 13:34:51.732964 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.732848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 23 13:34:51.732964 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.732897 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 23 13:34:51.733188 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.733173 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 23 13:34:51.739887 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.739866 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-587d985fcf-ptfdp"]
Apr 23 13:34:51.795493 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.795462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctl7q\" (UniqueName: \"kubernetes.io/projected/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-kube-api-access-ctl7q\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.795633 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.795515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.795633 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.795541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-audit-log\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.795633 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.795568 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-metrics-server-audit-profiles\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.795633 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.795592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-secret-metrics-server-client-certs\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.795787 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.795660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-secret-metrics-server-tls\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.795787 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.795692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-client-ca-bundle\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.896907 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.896882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctl7q\" (UniqueName: \"kubernetes.io/projected/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-kube-api-access-ctl7q\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.897022 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.896928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.897022 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.896955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-audit-log\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.897022 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.896994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-metrics-server-audit-profiles\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.897180 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.897045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-secret-metrics-server-client-certs\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.897278 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.897256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-secret-metrics-server-tls\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.897340 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.897323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-client-ca-bundle\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.897390 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.897362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-audit-log\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.897746 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.897720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.898097 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.898059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-metrics-server-audit-profiles\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.899894 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.899870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-secret-metrics-server-client-certs\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.899997 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.899960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-secret-metrics-server-tls\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.900223 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.900202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-client-ca-bundle\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:51.904684 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:51.904656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctl7q\" (UniqueName: \"kubernetes.io/projected/3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e-kube-api-access-ctl7q\") pod \"metrics-server-587d985fcf-ptfdp\" (UID: \"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e\") " pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:52.039839 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:52.039797 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp"
Apr 23 13:34:52.170932 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:52.170903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-587d985fcf-ptfdp"]
Apr 23 13:34:52.173028 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:52.172996 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4bd3ee_f28d_44dc_a47b_9d12d6f3945e.slice/crio-bdc5ae67255a404e8a862e1f9e6a9337e2c5a83fe6132524d0d0780d88ce67d7 WatchSource:0}: Error finding container bdc5ae67255a404e8a862e1f9e6a9337e2c5a83fe6132524d0d0780d88ce67d7: Status 404 returned error can't find the container with id bdc5ae67255a404e8a862e1f9e6a9337e2c5a83fe6132524d0d0780d88ce67d7
Apr 23 13:34:52.632183 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:52.632144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp" event={"ID":"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e","Type":"ContainerStarted","Data":"bdc5ae67255a404e8a862e1f9e6a9337e2c5a83fe6132524d0d0780d88ce67d7"}
Apr 23 13:34:52.634182 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:52.634152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" event={"ID":"87058acd-6bca-432b-b6aa-6f61500ac7f8","Type":"ContainerStarted","Data":"1d29d78ce9c3ec61ccbdfbc0cd7835b2072812a814c736a419057f1f9a654998"}
Apr 23 13:34:52.634289 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:52.634189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" event={"ID":"87058acd-6bca-432b-b6aa-6f61500ac7f8","Type":"ContainerStarted","Data":"32a1fc32ce3c83b3ebefc86e7dd3cef03432e74bd0b66206e4c0e9cd99604366"}
Apr 23 13:34:52.634289 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:52.634200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" event={"ID":"87058acd-6bca-432b-b6aa-6f61500ac7f8","Type":"ContainerStarted","Data":"d99d60771d97d4efa9f819a864256f1303347940baa1d9a559165f647614cd1f"}
Apr 23 13:34:53.640456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.640365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerStarted","Data":"45c0e3870a510d51dce02a86a2c43377b414251cc27e1eaa35389517c8ac8011"}
Apr 23 13:34:53.640456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.640409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerStarted","Data":"1cdcb90a9594536fc7499d7378078675b470b8064531fe86ce58a177051c6d27"}
Apr 23 13:34:53.640456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.640425 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerStarted","Data":"a388830c5cf350c09634754b6251574cc535933503412e96ce5046cea4390fe4"}
Apr 23 13:34:53.640456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.640438 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerStarted","Data":"cff1003b6e19cd35c9bded6c6ead534ba1456a1dc9c03f2a330013116ecd2798"}
Apr 23 13:34:53.640456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.640453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerStarted","Data":"fad91a5dc16fca5d9e6afc1d3a0183ad5e210aa90aaee2a88a72182f8500b8b2"}
Apr 23 13:34:53.640456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.640465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerStarted","Data":"7d48946940783cac6adb3dc5ec92d7a8949e36ed8419684b19d19f6749a95a4b"}
Apr 23 13:34:53.643074 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.643043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" event={"ID":"87058acd-6bca-432b-b6aa-6f61500ac7f8","Type":"ContainerStarted","Data":"17672bb94d91d4d0b584029fb35855d6508c296bf5c54cf12b20229d4ee84ce7"}
Apr 23 13:34:53.643074 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.643076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" event={"ID":"87058acd-6bca-432b-b6aa-6f61500ac7f8","Type":"ContainerStarted","Data":"35a1dc7a316a9681a1175e6831b3ffa95a4d4f26cc04ad8a0480a7130f773218"}
Apr 23 13:34:53.643247 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.643090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" event={"ID":"87058acd-6bca-432b-b6aa-6f61500ac7f8","Type":"ContainerStarted","Data":"8feecb342322a87cf0f44fb71e6e9f68330d4511d9376dfb8dbf7c8f6327ec0f"}
Apr 23 13:34:53.643247 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.643234 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw"
Apr 23 13:34:53.673681 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.673631 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.274942615 podStartE2EDuration="5.673616233s" podCreationTimestamp="2026-04-23 13:34:48 +0000 UTC" firstStartedPulling="2026-04-23 13:34:49.566622148 +0000 UTC m=+156.110366343" lastFinishedPulling="2026-04-23 13:34:52.965295755 +0000 UTC m=+159.509039961" observedRunningTime="2026-04-23 13:34:53.671207231 +0000 UTC m=+160.214951446" watchObservedRunningTime="2026-04-23 13:34:53.673616233 +0000 UTC m=+160.217360447"
Apr 23 13:34:53.699941 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.699882 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" podStartSLOduration=1.687679906 podStartE2EDuration="4.69986023s" podCreationTimestamp="2026-04-23 13:34:49 +0000 UTC" firstStartedPulling="2026-04-23 13:34:49.949225635 +0000 UTC m=+156.492969828" lastFinishedPulling="2026-04-23 13:34:52.96140594 +0000 UTC m=+159.505150152" observedRunningTime="2026-04-23 13:34:53.697975708 +0000 UTC m=+160.241719923" watchObservedRunningTime="2026-04-23 13:34:53.69986023 +0000 UTC m=+160.243604445"
Apr 23 13:34:53.715095 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.715060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6"
Apr 23 13:34:53.715322 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.715303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:34:53.718093 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.718068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6361136-0129-4e11-9891-a7117fbe5be5-metrics-tls\") pod \"dns-default-6zrlv\" (UID: \"d6361136-0129-4e11-9891-a7117fbe5be5\") " pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:34:53.718199 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.718130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24201bdd-9893-495a-8f70-680500f3a31d-cert\") pod \"ingress-canary-5s4f6\" (UID: \"24201bdd-9893-495a-8f70-680500f3a31d\") " pod="openshift-ingress-canary/ingress-canary-5s4f6"
Apr 23 13:34:53.813038 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.813008 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-svh2p\""
Apr 23 13:34:53.821526 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:53.821471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:34:54.012378 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:54.012034 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6zrlv"]
Apr 23 13:34:54.016596 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:34:54.016333 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6361136_0129_4e11_9891_a7117fbe5be5.slice/crio-5e20ed247b6f445c08c15b3c9400319a88ae372cca0ced87512a625af9bfb600 WatchSource:0}: Error finding container 5e20ed247b6f445c08c15b3c9400319a88ae372cca0ced87512a625af9bfb600: Status 404 returned error can't find the container with id 5e20ed247b6f445c08c15b3c9400319a88ae372cca0ced87512a625af9bfb600
Apr 23 13:34:54.647930 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:54.647894 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp" event={"ID":"3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e","Type":"ContainerStarted","Data":"b405d5eb15bb0c86af125157c9225dc8b8e14a9282ea4d7350e4340f97cf6ce9"}
Apr 23 13:34:54.649125 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:54.649096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6zrlv" event={"ID":"d6361136-0129-4e11-9891-a7117fbe5be5","Type":"ContainerStarted","Data":"5e20ed247b6f445c08c15b3c9400319a88ae372cca0ced87512a625af9bfb600"}
Apr 23 13:34:54.666994 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:54.666939 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp" podStartSLOduration=1.898529113 podStartE2EDuration="3.666925434s" podCreationTimestamp="2026-04-23 13:34:51 +0000 UTC" firstStartedPulling="2026-04-23 13:34:52.175326223 +0000 UTC m=+158.719070422" lastFinishedPulling="2026-04-23 13:34:53.943722547 +0000 UTC m=+160.487466743" observedRunningTime="2026-04-23 13:34:54.664975981 +0000 UTC m=+161.208720197" watchObservedRunningTime="2026-04-23 13:34:54.666925434 +0000 UTC m=+161.210669647"
Apr 23 13:34:55.653401 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:55.653361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6zrlv" event={"ID":"d6361136-0129-4e11-9891-a7117fbe5be5","Type":"ContainerStarted","Data":"38751a90065f40a90eed6d937c8c4ad067a4b8f8f8bc03f568182fd51acc00ce"}
Apr 23 13:34:55.653916 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:55.653409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6zrlv" event={"ID":"d6361136-0129-4e11-9891-a7117fbe5be5","Type":"ContainerStarted","Data":"e1e4efaf5b831b25215eecbec57d4d3768f6d31618f1971ec75fe08b6b1ad4e8"}
Apr 23 13:34:55.671231 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:55.671181 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6zrlv" podStartSLOduration=129.445784195 podStartE2EDuration="2m10.671164105s" podCreationTimestamp="2026-04-23 13:32:45 +0000 UTC" firstStartedPulling="2026-04-23 13:34:54.018189304 +0000 UTC m=+160.561933497" lastFinishedPulling="2026-04-23 13:34:55.243569201 +0000 UTC m=+161.787313407" observedRunningTime="2026-04-23 13:34:55.669895161 +0000 UTC m=+162.213639387" watchObservedRunningTime="2026-04-23 13:34:55.671164105 +0000 UTC m=+162.214908319"
Apr 23 13:34:56.378924 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:56.378879 2576 patch_prober.go:28] interesting pod/image-registry-849f746c6-4s62k container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 13:34:56.379085 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:56.378944 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-849f746c6-4s62k" podUID="9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:34:56.657251 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:56.657167 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6zrlv"
Apr 23 13:34:58.573605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:58.573575 2576 patch_prober.go:28] interesting pod/image-registry-849f746c6-4s62k container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 13:34:58.573975 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:58.573623 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-849f746c6-4s62k" podUID="9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 13:34:59.654766 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:34:59.654732 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-monitoring/thanos-querier-6ccccfc88b-ddhzw" Apr 23 13:35:02.056526 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:02.056470 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m" Apr 23 13:35:02.056994 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:02.056667 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:35:02.059296 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:02.059277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-cqcw8\"" Apr 23 13:35:02.067312 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:02.067295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5s4f6" Apr 23 13:35:02.190051 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:02.190022 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5s4f6"] Apr 23 13:35:02.192638 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:35:02.192604 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24201bdd_9893_495a_8f70_680500f3a31d.slice/crio-ef5b31c65f677f98cb929bfacc6e757dda447342cc5fa7f6d0e2f094e167c9cf WatchSource:0}: Error finding container ef5b31c65f677f98cb929bfacc6e757dda447342cc5fa7f6d0e2f094e167c9cf: Status 404 returned error can't find the container with id ef5b31c65f677f98cb929bfacc6e757dda447342cc5fa7f6d0e2f094e167c9cf Apr 23 13:35:02.674586 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:02.674547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5s4f6" event={"ID":"24201bdd-9893-495a-8f70-680500f3a31d","Type":"ContainerStarted","Data":"ef5b31c65f677f98cb929bfacc6e757dda447342cc5fa7f6d0e2f094e167c9cf"} Apr 
23 13:35:03.057361 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:03.057329 2576 scope.go:117] "RemoveContainer" containerID="1cc4f9b2a18e68e7e023e394b69a5e978bc02346d9f8b7d04568a8e4e30fc25c" Apr 23 13:35:03.680731 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:03.680701 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 13:35:03.680956 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:03.680826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" event={"ID":"9aeb729e-46fa-42be-8d0f-9045eabfad26","Type":"ContainerStarted","Data":"b0466767c08631b56079f8ca1b2ced36c45ecee3661ae81818c4e772e4cef898"} Apr 23 13:35:03.681157 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:03.681122 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:35:03.699847 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:03.699789 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" podStartSLOduration=54.985580996 podStartE2EDuration="57.69977281s" podCreationTimestamp="2026-04-23 13:34:06 +0000 UTC" firstStartedPulling="2026-04-23 13:34:07.400543338 +0000 UTC m=+113.944287534" lastFinishedPulling="2026-04-23 13:34:10.114735154 +0000 UTC m=+116.658479348" observedRunningTime="2026-04-23 13:35:03.699153671 +0000 UTC m=+170.242897886" watchObservedRunningTime="2026-04-23 13:35:03.69977281 +0000 UTC m=+170.243517024" Apr 23 13:35:03.952277 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:03.952186 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-s48s8" Apr 23 13:35:04.142671 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:35:04.142635 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-7tkrn"] Apr 23 13:35:04.146213 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.146186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7tkrn" Apr 23 13:35:04.149198 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.149154 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 13:35:04.149343 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.149252 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 13:35:04.149343 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.149278 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-z9v79\"" Apr 23 13:35:04.157784 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.157758 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7tkrn"] Apr 23 13:35:04.213251 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.213148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzgrq\" (UniqueName: \"kubernetes.io/projected/83cc2d44-2519-43b2-87a7-a7cae9cf2256-kube-api-access-zzgrq\") pod \"downloads-6bcc868b7-7tkrn\" (UID: \"83cc2d44-2519-43b2-87a7-a7cae9cf2256\") " pod="openshift-console/downloads-6bcc868b7-7tkrn" Apr 23 13:35:04.313711 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.313665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzgrq\" (UniqueName: \"kubernetes.io/projected/83cc2d44-2519-43b2-87a7-a7cae9cf2256-kube-api-access-zzgrq\") pod \"downloads-6bcc868b7-7tkrn\" (UID: \"83cc2d44-2519-43b2-87a7-a7cae9cf2256\") " 
pod="openshift-console/downloads-6bcc868b7-7tkrn" Apr 23 13:35:04.323057 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.323022 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzgrq\" (UniqueName: \"kubernetes.io/projected/83cc2d44-2519-43b2-87a7-a7cae9cf2256-kube-api-access-zzgrq\") pod \"downloads-6bcc868b7-7tkrn\" (UID: \"83cc2d44-2519-43b2-87a7-a7cae9cf2256\") " pod="openshift-console/downloads-6bcc868b7-7tkrn" Apr 23 13:35:04.455165 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.455125 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7tkrn" Apr 23 13:35:04.574404 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.574370 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7tkrn"] Apr 23 13:35:04.577848 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:35:04.577816 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83cc2d44_2519_43b2_87a7_a7cae9cf2256.slice/crio-64198903109ed7e03f7822cc52690003c5971cbd9646f597bead26a3eae3e8a1 WatchSource:0}: Error finding container 64198903109ed7e03f7822cc52690003c5971cbd9646f597bead26a3eae3e8a1: Status 404 returned error can't find the container with id 64198903109ed7e03f7822cc52690003c5971cbd9646f597bead26a3eae3e8a1 Apr 23 13:35:04.684494 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.684454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7tkrn" event={"ID":"83cc2d44-2519-43b2-87a7-a7cae9cf2256","Type":"ContainerStarted","Data":"64198903109ed7e03f7822cc52690003c5971cbd9646f597bead26a3eae3e8a1"} Apr 23 13:35:04.685712 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.685678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5s4f6" 
event={"ID":"24201bdd-9893-495a-8f70-680500f3a31d","Type":"ContainerStarted","Data":"0f7e9822b5481d4f24f646d0dadfff4558b05abd2bde478ec596093e497d55c4"} Apr 23 13:35:04.701948 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:04.701902 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5s4f6" podStartSLOduration=137.8294363 podStartE2EDuration="2m19.701888448s" podCreationTimestamp="2026-04-23 13:32:45 +0000 UTC" firstStartedPulling="2026-04-23 13:35:02.194940169 +0000 UTC m=+168.738684362" lastFinishedPulling="2026-04-23 13:35:04.067392314 +0000 UTC m=+170.611136510" observedRunningTime="2026-04-23 13:35:04.700694912 +0000 UTC m=+171.244439150" watchObservedRunningTime="2026-04-23 13:35:04.701888448 +0000 UTC m=+171.245632662" Apr 23 13:35:06.379212 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:06.379152 2576 patch_prober.go:28] interesting pod/image-registry-849f746c6-4s62k container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:35:06.379627 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:06.379254 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-849f746c6-4s62k" podUID="9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:35:06.662841 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:06.662752 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6zrlv" Apr 23 13:35:08.574288 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:08.574257 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-849f746c6-4s62k" Apr 23 
13:35:12.040728 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:12.040677 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp" Apr 23 13:35:12.041201 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:12.040754 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp" Apr 23 13:35:14.648204 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.648164 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-756ddd9f8f-8mfgz"] Apr 23 13:35:14.653175 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.653087 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.657243 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.657207 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 13:35:14.657391 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.657208 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-n2wjk\"" Apr 23 13:35:14.657391 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.657207 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:35:14.657391 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.657265 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:35:14.657391 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.657270 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:35:14.657391 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.657209 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:35:14.663810 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.663789 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756ddd9f8f-8mfgz"] Apr 23 13:35:14.814824 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.814781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-serving-cert\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.814824 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.814832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb9jt\" (UniqueName: \"kubernetes.io/projected/418b7654-b5f4-4bde-917b-372bcb6f865c-kube-api-access-kb9jt\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.815058 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.814959 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-oauth-config\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.815058 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.815029 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-console-config\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 
13:35:14.815133 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.815063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-oauth-serving-cert\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.815185 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.815126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-service-ca\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.915779 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.915687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-oauth-config\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.915779 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.915749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-console-config\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.915979 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.915781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-oauth-serving-cert\") pod \"console-756ddd9f8f-8mfgz\" (UID: 
\"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.915979 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.915832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-service-ca\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.915979 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.915904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-serving-cert\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.915979 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.915939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb9jt\" (UniqueName: \"kubernetes.io/projected/418b7654-b5f4-4bde-917b-372bcb6f865c-kube-api-access-kb9jt\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.916640 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.916586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-console-config\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.916778 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.916643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-service-ca\") pod 
\"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.916778 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.916662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-oauth-serving-cert\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.918541 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.918494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-oauth-config\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.918654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.918637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-serving-cert\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.925223 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.925197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb9jt\" (UniqueName: \"kubernetes.io/projected/418b7654-b5f4-4bde-917b-372bcb6f865c-kube-api-access-kb9jt\") pod \"console-756ddd9f8f-8mfgz\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:14.964227 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:14.964189 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:19.906116 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:19.906086 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-756ddd9f8f-8mfgz"] Apr 23 13:35:19.913685 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:35:19.913648 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod418b7654_b5f4_4bde_917b_372bcb6f865c.slice/crio-c48c43451b943a17bda52ce9b7b4cb6b0ea8a5795a8958f84748c216c5420701 WatchSource:0}: Error finding container c48c43451b943a17bda52ce9b7b4cb6b0ea8a5795a8958f84748c216c5420701: Status 404 returned error can't find the container with id c48c43451b943a17bda52ce9b7b4cb6b0ea8a5795a8958f84748c216c5420701 Apr 23 13:35:20.743378 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:20.743337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7tkrn" event={"ID":"83cc2d44-2519-43b2-87a7-a7cae9cf2256","Type":"ContainerStarted","Data":"7f72ca923617fbbb2dacd57781bcfeeafb07aaa8ae71316d353c4c6d4de39f9b"} Apr 23 13:35:20.743653 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:20.743561 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-7tkrn" Apr 23 13:35:20.745309 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:20.745266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756ddd9f8f-8mfgz" event={"ID":"418b7654-b5f4-4bde-917b-372bcb6f865c","Type":"ContainerStarted","Data":"c48c43451b943a17bda52ce9b7b4cb6b0ea8a5795a8958f84748c216c5420701"} Apr 23 13:35:20.763466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:20.763408 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-7tkrn" podStartSLOduration=1.4645822800000001 podStartE2EDuration="16.763393214s" 
podCreationTimestamp="2026-04-23 13:35:04 +0000 UTC" firstStartedPulling="2026-04-23 13:35:04.580158502 +0000 UTC m=+171.123902694" lastFinishedPulling="2026-04-23 13:35:19.878969414 +0000 UTC m=+186.422713628" observedRunningTime="2026-04-23 13:35:20.76306108 +0000 UTC m=+187.306805321" watchObservedRunningTime="2026-04-23 13:35:20.763393214 +0000 UTC m=+187.307137430" Apr 23 13:35:20.766128 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:20.766098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-7tkrn" Apr 23 13:35:23.757246 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:23.757204 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756ddd9f8f-8mfgz" event={"ID":"418b7654-b5f4-4bde-917b-372bcb6f865c","Type":"ContainerStarted","Data":"d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b"} Apr 23 13:35:23.775642 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:23.775591 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-756ddd9f8f-8mfgz" podStartSLOduration=6.106616588 podStartE2EDuration="9.775572488s" podCreationTimestamp="2026-04-23 13:35:14 +0000 UTC" firstStartedPulling="2026-04-23 13:35:19.915740042 +0000 UTC m=+186.459484234" lastFinishedPulling="2026-04-23 13:35:23.584695923 +0000 UTC m=+190.128440134" observedRunningTime="2026-04-23 13:35:23.773933163 +0000 UTC m=+190.317677392" watchObservedRunningTime="2026-04-23 13:35:23.775572488 +0000 UTC m=+190.319316705" Apr 23 13:35:24.237339 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.237302 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6996d648f7-r8twh"] Apr 23 13:35:24.264394 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.264362 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6996d648f7-r8twh"] Apr 23 13:35:24.264581 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:35:24.264521 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.273441 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.273410 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 13:35:24.304467 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.304425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-trusted-ca-bundle\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.304672 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.304482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-oauth-config\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.304672 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.304549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-oauth-serving-cert\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.304672 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.304574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-service-ca\") pod \"console-6996d648f7-r8twh\" (UID: 
\"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.304672 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.304656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-serving-cert\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.304846 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.304693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmddz\" (UniqueName: \"kubernetes.io/projected/44205493-6f88-4d5c-8294-79c779b8cd9a-kube-api-access-kmddz\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.304846 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.304713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-console-config\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.405655 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.405620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-serving-cert\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.405839 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.405668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kmddz\" (UniqueName: \"kubernetes.io/projected/44205493-6f88-4d5c-8294-79c779b8cd9a-kube-api-access-kmddz\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.405839 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.405699 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-console-config\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.405920 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.405879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-trusted-ca-bundle\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.405969 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.405935 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-oauth-config\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.405969 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.405954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-oauth-serving-cert\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.406052 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.405975 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-service-ca\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.406466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.406437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-console-config\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.406705 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.406679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-oauth-serving-cert\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.406875 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.406720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-service-ca\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.406999 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.406878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-trusted-ca-bundle\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.408562 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:35:24.408538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-oauth-config\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.408687 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.408667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-serving-cert\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.413278 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.413254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmddz\" (UniqueName: \"kubernetes.io/projected/44205493-6f88-4d5c-8294-79c779b8cd9a-kube-api-access-kmddz\") pod \"console-6996d648f7-r8twh\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") " pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.575775 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.575733 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:24.715009 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.714941 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6996d648f7-r8twh"] Apr 23 13:35:24.719878 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:35:24.719845 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44205493_6f88_4d5c_8294_79c779b8cd9a.slice/crio-8322857db29bad9943a14bb36757b9bb1e8c58bac5c2ff32816e795df888d99a WatchSource:0}: Error finding container 8322857db29bad9943a14bb36757b9bb1e8c58bac5c2ff32816e795df888d99a: Status 404 returned error can't find the container with id 8322857db29bad9943a14bb36757b9bb1e8c58bac5c2ff32816e795df888d99a Apr 23 13:35:24.761583 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.761545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6996d648f7-r8twh" event={"ID":"44205493-6f88-4d5c-8294-79c779b8cd9a","Type":"ContainerStarted","Data":"8322857db29bad9943a14bb36757b9bb1e8c58bac5c2ff32816e795df888d99a"} Apr 23 13:35:24.965075 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.965032 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:24.965241 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.965088 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:24.970527 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:24.970479 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:25.767176 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:25.767130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6996d648f7-r8twh" 
event={"ID":"44205493-6f88-4d5c-8294-79c779b8cd9a","Type":"ContainerStarted","Data":"0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801"} Apr 23 13:35:25.771663 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:25.771636 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:35:25.786773 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:25.786708 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6996d648f7-r8twh" podStartSLOduration=1.786692709 podStartE2EDuration="1.786692709s" podCreationTimestamp="2026-04-23 13:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:35:25.784855879 +0000 UTC m=+192.328600120" watchObservedRunningTime="2026-04-23 13:35:25.786692709 +0000 UTC m=+192.330436923" Apr 23 13:35:26.772128 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:26.772093 2576 generic.go:358] "Generic (PLEG): container finished" podID="0b1e6cd3-17a8-4f73-b12c-4f3725a10c29" containerID="e1237ba7d2c6aac7c1836477a77db09fb9a2809e92260d1d1f10f73e162dffcd" exitCode=0 Apr 23 13:35:26.772659 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:26.772133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" event={"ID":"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29","Type":"ContainerDied","Data":"e1237ba7d2c6aac7c1836477a77db09fb9a2809e92260d1d1f10f73e162dffcd"} Apr 23 13:35:26.772788 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:26.772672 2576 scope.go:117] "RemoveContainer" containerID="e1237ba7d2c6aac7c1836477a77db09fb9a2809e92260d1d1f10f73e162dffcd" Apr 23 13:35:27.777922 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:27.777878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-r59bz" event={"ID":"0b1e6cd3-17a8-4f73-b12c-4f3725a10c29","Type":"ContainerStarted","Data":"561c6708baa8b523295263de0420a6744a75d567aea4a2ae1df636d4cedddb3f"} Apr 23 13:35:32.045459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:32.045427 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp" Apr 23 13:35:32.049398 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:32.049381 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-587d985fcf-ptfdp" Apr 23 13:35:34.576738 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:34.576698 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:34.577236 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:34.576758 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:34.581648 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:34.581620 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:34.805463 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:34.805436 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6996d648f7-r8twh" Apr 23 13:35:34.849366 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:34.849290 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756ddd9f8f-8mfgz"] Apr 23 13:35:40.819191 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:40.819158 2576 generic.go:358] "Generic (PLEG): container finished" podID="60898bb3-109a-472a-a90e-9b1a908a6d36" containerID="f7edd248bd0fab4d81e8bb70608f4dec487b9ff81aaac33af2eb46bc694b3123" exitCode=0 Apr 23 
13:35:40.819598 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:40.819227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tpxdm" event={"ID":"60898bb3-109a-472a-a90e-9b1a908a6d36","Type":"ContainerDied","Data":"f7edd248bd0fab4d81e8bb70608f4dec487b9ff81aaac33af2eb46bc694b3123"} Apr 23 13:35:40.819598 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:40.819562 2576 scope.go:117] "RemoveContainer" containerID="f7edd248bd0fab4d81e8bb70608f4dec487b9ff81aaac33af2eb46bc694b3123" Apr 23 13:35:41.823931 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:41.823897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-tpxdm" event={"ID":"60898bb3-109a-472a-a90e-9b1a908a6d36","Type":"ContainerStarted","Data":"65c54d70d9bba469c40c771394ab8d2344f24e5c2f83f1b240f6a41d46f44c96"} Apr 23 13:35:59.872364 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:35:59.872298 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-756ddd9f8f-8mfgz" podUID="418b7654-b5f4-4bde-917b-372bcb6f865c" containerName="console" containerID="cri-o://d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b" gracePeriod=15 Apr 23 13:36:00.141612 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.141586 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756ddd9f8f-8mfgz_418b7654-b5f4-4bde-917b-372bcb6f865c/console/0.log" Apr 23 13:36:00.141740 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.141665 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:36:00.251260 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251228 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-oauth-config\") pod \"418b7654-b5f4-4bde-917b-372bcb6f865c\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " Apr 23 13:36:00.251425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251281 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-serving-cert\") pod \"418b7654-b5f4-4bde-917b-372bcb6f865c\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " Apr 23 13:36:00.251425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251315 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb9jt\" (UniqueName: \"kubernetes.io/projected/418b7654-b5f4-4bde-917b-372bcb6f865c-kube-api-access-kb9jt\") pod \"418b7654-b5f4-4bde-917b-372bcb6f865c\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " Apr 23 13:36:00.251425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251331 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-console-config\") pod \"418b7654-b5f4-4bde-917b-372bcb6f865c\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " Apr 23 13:36:00.251425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251377 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-service-ca\") pod \"418b7654-b5f4-4bde-917b-372bcb6f865c\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " Apr 23 13:36:00.251425 
ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251411 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-oauth-serving-cert\") pod \"418b7654-b5f4-4bde-917b-372bcb6f865c\" (UID: \"418b7654-b5f4-4bde-917b-372bcb6f865c\") " Apr 23 13:36:00.251858 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251829 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-console-config" (OuterVolumeSpecName: "console-config") pod "418b7654-b5f4-4bde-917b-372bcb6f865c" (UID: "418b7654-b5f4-4bde-917b-372bcb6f865c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:36:00.251957 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251860 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-service-ca" (OuterVolumeSpecName: "service-ca") pod "418b7654-b5f4-4bde-917b-372bcb6f865c" (UID: "418b7654-b5f4-4bde-917b-372bcb6f865c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:36:00.251957 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.251911 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "418b7654-b5f4-4bde-917b-372bcb6f865c" (UID: "418b7654-b5f4-4bde-917b-372bcb6f865c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:36:00.253640 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.253617 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "418b7654-b5f4-4bde-917b-372bcb6f865c" (UID: "418b7654-b5f4-4bde-917b-372bcb6f865c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:36:00.253711 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.253660 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/418b7654-b5f4-4bde-917b-372bcb6f865c-kube-api-access-kb9jt" (OuterVolumeSpecName: "kube-api-access-kb9jt") pod "418b7654-b5f4-4bde-917b-372bcb6f865c" (UID: "418b7654-b5f4-4bde-917b-372bcb6f865c"). InnerVolumeSpecName "kube-api-access-kb9jt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:36:00.253749 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.253721 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "418b7654-b5f4-4bde-917b-372bcb6f865c" (UID: "418b7654-b5f4-4bde-917b-372bcb6f865c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:36:00.352412 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.352368 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-service-ca\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:36:00.352412 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.352406 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-oauth-serving-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:36:00.352412 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.352416 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-oauth-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:36:00.352675 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.352427 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/418b7654-b5f4-4bde-917b-372bcb6f865c-console-serving-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:36:00.352675 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.352437 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kb9jt\" (UniqueName: \"kubernetes.io/projected/418b7654-b5f4-4bde-917b-372bcb6f865c-kube-api-access-kb9jt\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:36:00.352675 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.352446 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/418b7654-b5f4-4bde-917b-372bcb6f865c-console-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:36:00.884700 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:36:00.884669 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-756ddd9f8f-8mfgz_418b7654-b5f4-4bde-917b-372bcb6f865c/console/0.log" Apr 23 13:36:00.885109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.884710 2576 generic.go:358] "Generic (PLEG): container finished" podID="418b7654-b5f4-4bde-917b-372bcb6f865c" containerID="d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b" exitCode=2 Apr 23 13:36:00.885109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.884739 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756ddd9f8f-8mfgz" event={"ID":"418b7654-b5f4-4bde-917b-372bcb6f865c","Type":"ContainerDied","Data":"d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b"} Apr 23 13:36:00.885109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.884763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-756ddd9f8f-8mfgz" event={"ID":"418b7654-b5f4-4bde-917b-372bcb6f865c","Type":"ContainerDied","Data":"c48c43451b943a17bda52ce9b7b4cb6b0ea8a5795a8958f84748c216c5420701"} Apr 23 13:36:00.885109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.884777 2576 scope.go:117] "RemoveContainer" containerID="d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b" Apr 23 13:36:00.885109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.884799 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-756ddd9f8f-8mfgz" Apr 23 13:36:00.892697 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.892678 2576 scope.go:117] "RemoveContainer" containerID="d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b" Apr 23 13:36:00.892966 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:36:00.892943 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b\": container with ID starting with d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b not found: ID does not exist" containerID="d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b" Apr 23 13:36:00.893045 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.892977 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b"} err="failed to get container status \"d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b\": rpc error: code = NotFound desc = could not find container \"d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b\": container with ID starting with d61c53403a3e0c09bd35c336b86009beac41827b1eb1b92fa4187b0b049a9f9b not found: ID does not exist" Apr 23 13:36:00.913013 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.912985 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-756ddd9f8f-8mfgz"] Apr 23 13:36:00.919904 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:00.919883 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-756ddd9f8f-8mfgz"] Apr 23 13:36:02.060890 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:02.060856 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="418b7654-b5f4-4bde-917b-372bcb6f865c" 
path="/var/lib/kubelet/pods/418b7654-b5f4-4bde-917b-372bcb6f865c/volumes" Apr 23 13:36:07.764319 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.764280 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:36:07.764800 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.764744 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="alertmanager" containerID="cri-o://7d48946940783cac6adb3dc5ec92d7a8949e36ed8419684b19d19f6749a95a4b" gracePeriod=120 Apr 23 13:36:07.764854 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.764798 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy-metric" containerID="cri-o://1cdcb90a9594536fc7499d7378078675b470b8064531fe86ce58a177051c6d27" gracePeriod=120 Apr 23 13:36:07.764898 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.764868 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy" containerID="cri-o://a388830c5cf350c09634754b6251574cc535933503412e96ce5046cea4390fe4" gracePeriod=120 Apr 23 13:36:07.764945 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.764873 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="prom-label-proxy" containerID="cri-o://45c0e3870a510d51dce02a86a2c43377b414251cc27e1eaa35389517c8ac8011" gracePeriod=120 Apr 23 13:36:07.764945 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.764895 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="config-reloader" containerID="cri-o://fad91a5dc16fca5d9e6afc1d3a0183ad5e210aa90aaee2a88a72182f8500b8b2" gracePeriod=120 Apr 23 13:36:07.765046 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.764871 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy-web" containerID="cri-o://cff1003b6e19cd35c9bded6c6ead534ba1456a1dc9c03f2a330013116ecd2798" gracePeriod=120 Apr 23 13:36:07.913601 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.913568 2576 generic.go:358] "Generic (PLEG): container finished" podID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerID="45c0e3870a510d51dce02a86a2c43377b414251cc27e1eaa35389517c8ac8011" exitCode=0 Apr 23 13:36:07.913601 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.913597 2576 generic.go:358] "Generic (PLEG): container finished" podID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerID="a388830c5cf350c09634754b6251574cc535933503412e96ce5046cea4390fe4" exitCode=0 Apr 23 13:36:07.913601 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.913605 2576 generic.go:358] "Generic (PLEG): container finished" podID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerID="fad91a5dc16fca5d9e6afc1d3a0183ad5e210aa90aaee2a88a72182f8500b8b2" exitCode=0 Apr 23 13:36:07.913802 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.913611 2576 generic.go:358] "Generic (PLEG): container finished" podID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerID="7d48946940783cac6adb3dc5ec92d7a8949e36ed8419684b19d19f6749a95a4b" exitCode=0 Apr 23 13:36:07.913802 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.913638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerDied","Data":"45c0e3870a510d51dce02a86a2c43377b414251cc27e1eaa35389517c8ac8011"} Apr 23 13:36:07.913802 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.913672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerDied","Data":"a388830c5cf350c09634754b6251574cc535933503412e96ce5046cea4390fe4"} Apr 23 13:36:07.913802 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.913685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerDied","Data":"fad91a5dc16fca5d9e6afc1d3a0183ad5e210aa90aaee2a88a72182f8500b8b2"} Apr 23 13:36:07.913802 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:07.913695 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerDied","Data":"7d48946940783cac6adb3dc5ec92d7a8949e36ed8419684b19d19f6749a95a4b"} Apr 23 13:36:08.920567 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:08.920532 2576 generic.go:358] "Generic (PLEG): container finished" podID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerID="1cdcb90a9594536fc7499d7378078675b470b8064531fe86ce58a177051c6d27" exitCode=0 Apr 23 13:36:08.920567 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:08.920562 2576 generic.go:358] "Generic (PLEG): container finished" podID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerID="cff1003b6e19cd35c9bded6c6ead534ba1456a1dc9c03f2a330013116ecd2798" exitCode=0 Apr 23 13:36:08.920943 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:08.920599 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerDied","Data":"1cdcb90a9594536fc7499d7378078675b470b8064531fe86ce58a177051c6d27"} Apr 23 13:36:08.920943 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:08.920640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerDied","Data":"cff1003b6e19cd35c9bded6c6ead534ba1456a1dc9c03f2a330013116ecd2798"} Apr 23 13:36:09.007369 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.007341 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:36:09.024192 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024157 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-config-out\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " Apr 23 13:36:09.024360 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024209 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxssr\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-kube-api-access-lxssr\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " Apr 23 13:36:09.024360 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024230 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") " Apr 23 13:36:09.024360 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024267 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024360 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024302 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-tls-assets\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024360 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024330 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-config-volume\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024676 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024373 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024676 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024400 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-metrics-client-ca\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024676 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-metric\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024676 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024498 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-cluster-tls-config\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024676 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024557 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-web-config\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024676 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-web\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.024676 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.024629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-main-db\") pod \"9624dc8f-91a2-4736-9cff-18f5203ab449\" (UID: \"9624dc8f-91a2-4736-9cff-18f5203ab449\") "
Apr 23 13:36:09.025293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.025187 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:36:09.025293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.025214 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:09.025951 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.025925 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:09.027667 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.027633 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-kube-api-access-lxssr" (OuterVolumeSpecName: "kube-api-access-lxssr") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "kube-api-access-lxssr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:36:09.027769 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.027724 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-config-out" (OuterVolumeSpecName: "config-out") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:36:09.027999 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.027956 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:09.028187 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.028161 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-config-volume" (OuterVolumeSpecName: "config-volume") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:09.028273 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.028178 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:09.028371 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.028348 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:36:09.028456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.028435 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:09.028985 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.028962 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:09.032620 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.032552 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:09.043402 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.043364 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-web-config" (OuterVolumeSpecName: "web-config") pod "9624dc8f-91a2-4736-9cff-18f5203ab449" (UID: "9624dc8f-91a2-4736-9cff-18f5203ab449"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126124 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lxssr\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-kube-api-access-lxssr\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126162 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126172 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-main-tls\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126182 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9624dc8f-91a2-4736-9cff-18f5203ab449-tls-assets\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126191 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-config-volume\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126199 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126208 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9624dc8f-91a2-4736-9cff-18f5203ab449-metrics-client-ca\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126218 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126220 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126228 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-cluster-tls-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126646 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126237 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-web-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126646 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126246 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9624dc8f-91a2-4736-9cff-18f5203ab449-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126646 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126255 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-alertmanager-main-db\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.126646 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.126263 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9624dc8f-91a2-4736-9cff-18f5203ab449-config-out\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:09.926277 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.926241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9624dc8f-91a2-4736-9cff-18f5203ab449","Type":"ContainerDied","Data":"143ae7cb679600711225b5a41ff3b3d3e1629c58556e68c0189168627c6442de"}
Apr 23 13:36:09.926690 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.926289 2576 scope.go:117] "RemoveContainer" containerID="45c0e3870a510d51dce02a86a2c43377b414251cc27e1eaa35389517c8ac8011"
Apr 23 13:36:09.926690 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.926301 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:09.936447 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.936431 2576 scope.go:117] "RemoveContainer" containerID="1cdcb90a9594536fc7499d7378078675b470b8064531fe86ce58a177051c6d27"
Apr 23 13:36:09.942965 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.942948 2576 scope.go:117] "RemoveContainer" containerID="a388830c5cf350c09634754b6251574cc535933503412e96ce5046cea4390fe4"
Apr 23 13:36:09.949354 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.949335 2576 scope.go:117] "RemoveContainer" containerID="cff1003b6e19cd35c9bded6c6ead534ba1456a1dc9c03f2a330013116ecd2798"
Apr 23 13:36:09.951890 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.951869 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:36:09.956487 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.956461 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:36:09.956957 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.956935 2576 scope.go:117] "RemoveContainer" containerID="fad91a5dc16fca5d9e6afc1d3a0183ad5e210aa90aaee2a88a72182f8500b8b2"
Apr 23 13:36:09.963276 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.963255 2576 scope.go:117] "RemoveContainer" containerID="7d48946940783cac6adb3dc5ec92d7a8949e36ed8419684b19d19f6749a95a4b"
Apr 23 13:36:09.969575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.969558 2576 scope.go:117] "RemoveContainer" containerID="096e1efc242e275cf59427a54482b7677921a122ac9bea4fd1b07704ac4d64ac"
Apr 23 13:36:09.982070 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982048 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:36:09.982356 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982344 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy"
Apr 23 13:36:09.982397 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982358 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy"
Apr 23 13:36:09.982397 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982370 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy-web"
Apr 23 13:36:09.982397 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982375 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy-web"
Apr 23 13:36:09.982397 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982382 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy-metric"
Apr 23 13:36:09.982397 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982388 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy-metric"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982402 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="config-reloader"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982409 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="config-reloader"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982420 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="prom-label-proxy"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982425 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="prom-label-proxy"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982434 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="init-config-reloader"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982439 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="init-config-reloader"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982447 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="alertmanager"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982452 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="alertmanager"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982460 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="418b7654-b5f4-4bde-917b-372bcb6f865c" containerName="console"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982465 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="418b7654-b5f4-4bde-917b-372bcb6f865c" containerName="console"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982531 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy-metric"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982544 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy"
Apr 23 13:36:09.982558 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982553 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="prom-label-proxy"
Apr 23 13:36:09.982913 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982564 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="418b7654-b5f4-4bde-917b-372bcb6f865c" containerName="console"
Apr 23 13:36:09.982913 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982575 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="kube-rbac-proxy-web"
Apr 23 13:36:09.982913 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982586 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="alertmanager"
Apr 23 13:36:09.982913 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.982597 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" containerName="config-reloader"
Apr 23 13:36:09.987360 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.987343 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:09.993408 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.993386 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 13:36:09.993757 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.993535 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 13:36:09.993757 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.993539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 13:36:09.993757 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.993642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 13:36:09.993995 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.993794 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 13:36:09.993995 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.993653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2cv4w\""
Apr 23 13:36:09.993995 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.993822 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 13:36:09.994107 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.994077 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 13:36:09.994107 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:09.994092 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 13:36:10.000277 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.000250 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 13:36:10.003397 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.003376 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 13:36:10.034277 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034247 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-web-config\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034409 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034409 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034301 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd41a95f-bc17-4277-9383-5fc99b246329-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034409 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd41a95f-bc17-4277-9383-5fc99b246329-config-out\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034409 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034597 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034597 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034597 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd41a95f-bc17-4277-9383-5fc99b246329-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034597 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jbf7\" (UniqueName: \"kubernetes.io/projected/cd41a95f-bc17-4277-9383-5fc99b246329-kube-api-access-4jbf7\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034597 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-config-volume\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034597 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd41a95f-bc17-4277-9383-5fc99b246329-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034844 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.034844 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.034706 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cd41a95f-bc17-4277-9383-5fc99b246329-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.061225 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.061198 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9624dc8f-91a2-4736-9cff-18f5203ab449" path="/var/lib/kubelet/pods/9624dc8f-91a2-4736-9cff-18f5203ab449/volumes"
Apr 23 13:36:10.135120 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135120 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135083 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135120 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd41a95f-bc17-4277-9383-5fc99b246329-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135345 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jbf7\" (UniqueName: \"kubernetes.io/projected/cd41a95f-bc17-4277-9383-5fc99b246329-kube-api-access-4jbf7\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135345 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-config-volume\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135450 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd41a95f-bc17-4277-9383-5fc99b246329-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135450 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135595 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cd41a95f-bc17-4277-9383-5fc99b246329-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135595 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-web-config\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135595 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135734 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd41a95f-bc17-4277-9383-5fc99b246329-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135734 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd41a95f-bc17-4277-9383-5fc99b246329-config-out\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135734 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.135896 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.135870 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cd41a95f-bc17-4277-9383-5fc99b246329-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.136159 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.136133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd41a95f-bc17-4277-9383-5fc99b246329-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.136323 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.136138 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd41a95f-bc17-4277-9383-5fc99b246329-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.138171 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.138120 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.138327 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.138295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-config-volume\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.138706 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.138660 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cd41a95f-bc17-4277-9383-5fc99b246329-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.138801 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.138772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.138801 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.138786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.138967 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.138948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-web-config\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 13:36:10.139026 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.138947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") "
pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:36:10.139571 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.139553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd41a95f-bc17-4277-9383-5fc99b246329-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:36:10.139800 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.139785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cd41a95f-bc17-4277-9383-5fc99b246329-config-out\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:36:10.144135 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.144113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jbf7\" (UniqueName: \"kubernetes.io/projected/cd41a95f-bc17-4277-9383-5fc99b246329-kube-api-access-4jbf7\") pod \"alertmanager-main-0\" (UID: \"cd41a95f-bc17-4277-9383-5fc99b246329\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:36:10.297670 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.297633 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:36:10.426487 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.426408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:36:10.429644 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:36:10.429607 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd41a95f_bc17_4277_9383_5fc99b246329.slice/crio-a3fafcb527c0befc89006e5f43b8193773d309391df1245c9e18e2678ac95e45 WatchSource:0}: Error finding container a3fafcb527c0befc89006e5f43b8193773d309391df1245c9e18e2678ac95e45: Status 404 returned error can't find the container with id a3fafcb527c0befc89006e5f43b8193773d309391df1245c9e18e2678ac95e45 Apr 23 13:36:10.929873 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.929835 2576 generic.go:358] "Generic (PLEG): container finished" podID="cd41a95f-bc17-4277-9383-5fc99b246329" containerID="b4726a9707ab3618ad6dbb6ccf3f32d50d55610bcc10a31caefc14876186609e" exitCode=0 Apr 23 13:36:10.930293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.929925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd41a95f-bc17-4277-9383-5fc99b246329","Type":"ContainerDied","Data":"b4726a9707ab3618ad6dbb6ccf3f32d50d55610bcc10a31caefc14876186609e"} Apr 23 13:36:10.930293 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:10.929963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd41a95f-bc17-4277-9383-5fc99b246329","Type":"ContainerStarted","Data":"a3fafcb527c0befc89006e5f43b8193773d309391df1245c9e18e2678ac95e45"} Apr 23 13:36:11.753982 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.753946 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-65cdfd779-glvmv"] Apr 23 13:36:11.757477 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:36:11.757454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.760285 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.760252 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 13:36:11.760414 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.760321 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 13:36:11.760414 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.760344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 13:36:11.760599 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.760580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 13:36:11.760677 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.760646 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-kmn4d\"" Apr 23 13:36:11.760902 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.760882 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 13:36:11.766413 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.766386 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 13:36:11.771611 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.771581 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-65cdfd779-glvmv"] Apr 23 13:36:11.851790 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:36:11.851753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-telemeter-client-tls\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.851790 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.851791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.851989 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.851900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-telemeter-trusted-ca-bundle\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.851989 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.851942 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-serving-certs-ca-bundle\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.852055 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.851995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-secret-telemeter-client\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.852055 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.852016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-metrics-client-ca\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.852114 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.852067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p5lf\" (UniqueName: \"kubernetes.io/projected/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-kube-api-access-7p5lf\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.852114 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.852089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-federate-client-tls\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.937096 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.937059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd41a95f-bc17-4277-9383-5fc99b246329","Type":"ContainerStarted","Data":"62a912bc3b3d314d310d8f4cdff0b037c3b99d82e0e80c170df3ad77259a7f65"} 
Apr 23 13:36:11.937096 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.937098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd41a95f-bc17-4277-9383-5fc99b246329","Type":"ContainerStarted","Data":"354fbf3845c4675a187a52f7305cfeb9599f39072460d30d8e2192b29d62d861"} Apr 23 13:36:11.937523 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.937107 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd41a95f-bc17-4277-9383-5fc99b246329","Type":"ContainerStarted","Data":"cb4016e8e7c6040536596bafc6fb0935c862fc2cd7490470a08686d493b9cceb"} Apr 23 13:36:11.937523 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.937115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd41a95f-bc17-4277-9383-5fc99b246329","Type":"ContainerStarted","Data":"36f355355f353c2ed09110022c8cb82929c1b26f7c80a1bb855c4d5d6fb01af7"} Apr 23 13:36:11.937523 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.937123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd41a95f-bc17-4277-9383-5fc99b246329","Type":"ContainerStarted","Data":"35dcaf895a364b970c4dc6ea68d546cf7e9e4a8d123c787d5b08360b431b77df"} Apr 23 13:36:11.937523 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.937131 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cd41a95f-bc17-4277-9383-5fc99b246329","Type":"ContainerStarted","Data":"936b8065d2eb081a3016252799fd1b39f66c3fe871d0214ab1a762c20ed3277e"} Apr 23 13:36:11.952789 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.952754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-telemeter-trusted-ca-bundle\") pod 
\"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.952952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.952803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-serving-certs-ca-bundle\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.952952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.952839 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-secret-telemeter-client\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.952952 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.952863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-metrics-client-ca\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.953064 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.952954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7p5lf\" (UniqueName: \"kubernetes.io/projected/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-kube-api-access-7p5lf\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.953064 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.953005 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-federate-client-tls\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.953164 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.953080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-telemeter-client-tls\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.953164 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.953105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.953723 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.953696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-serving-certs-ca-bundle\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.953854 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.953754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-metrics-client-ca\") pod 
\"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.953854 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.953813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-telemeter-trusted-ca-bundle\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.955594 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.955566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-federate-client-tls\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.955675 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.955656 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-telemeter-client-tls\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.955830 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.955814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.955872 ip-10-0-137-177 kubenswrapper[2576]: I0423 
13:36:11.955816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-secret-telemeter-client\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:11.961010 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.960972 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.960962222 podStartE2EDuration="2.960962222s" podCreationTimestamp="2026-04-23 13:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:36:11.960219245 +0000 UTC m=+238.503963495" watchObservedRunningTime="2026-04-23 13:36:11.960962222 +0000 UTC m=+238.504706437" Apr 23 13:36:11.964522 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:11.964490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p5lf\" (UniqueName: \"kubernetes.io/projected/fc0d4cb9-f2b6-4e44-b6d4-49011df8f021-kube-api-access-7p5lf\") pod \"telemeter-client-65cdfd779-glvmv\" (UID: \"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021\") " pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:12.072592 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:12.072562 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" Apr 23 13:36:12.194567 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:12.194532 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-65cdfd779-glvmv"] Apr 23 13:36:12.197672 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:36:12.197643 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0d4cb9_f2b6_4e44_b6d4_49011df8f021.slice/crio-53730a40b14810f1e454cd344de579a5f3d8a632bf4358b55b6f3a881f4be199 WatchSource:0}: Error finding container 53730a40b14810f1e454cd344de579a5f3d8a632bf4358b55b6f3a881f4be199: Status 404 returned error can't find the container with id 53730a40b14810f1e454cd344de579a5f3d8a632bf4358b55b6f3a881f4be199 Apr 23 13:36:12.944197 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:12.944153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" event={"ID":"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021","Type":"ContainerStarted","Data":"53730a40b14810f1e454cd344de579a5f3d8a632bf4358b55b6f3a881f4be199"} Apr 23 13:36:14.951873 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:14.951828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" event={"ID":"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021","Type":"ContainerStarted","Data":"f7f5aec35b329e9c05b54bd8ebfdd8a8f8bb6e8a51bd7e66eeabcb288a688f7b"} Apr 23 13:36:14.951873 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:14.951874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" event={"ID":"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021","Type":"ContainerStarted","Data":"d0547312b371c04d12b0aa45bedd889aef5ff24b25eef7ac4a1b4423b46559a2"} Apr 23 13:36:14.952308 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:14.951885 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" event={"ID":"fc0d4cb9-f2b6-4e44-b6d4-49011df8f021","Type":"ContainerStarted","Data":"fbb350c897cd50d54b7c4a1473b00716646dab4f02d2f0b2ccc40e4cbc10f107"} Apr 23 13:36:14.976416 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:14.976368 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-65cdfd779-glvmv" podStartSLOduration=2.051649367 podStartE2EDuration="3.976351736s" podCreationTimestamp="2026-04-23 13:36:11 +0000 UTC" firstStartedPulling="2026-04-23 13:36:12.199544985 +0000 UTC m=+238.743289182" lastFinishedPulling="2026-04-23 13:36:14.12424735 +0000 UTC m=+240.667991551" observedRunningTime="2026-04-23 13:36:14.97443533 +0000 UTC m=+241.518179579" watchObservedRunningTime="2026-04-23 13:36:14.976351736 +0000 UTC m=+241.520095949" Apr 23 13:36:15.681934 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.681892 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cd7dc59bf-2hhjh"] Apr 23 13:36:15.685467 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.685439 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cd7dc59bf-2hhjh" Apr 23 13:36:15.694922 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.694895 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cd7dc59bf-2hhjh"] Apr 23 13:36:15.793666 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.793632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-console-config\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh" Apr 23 13:36:15.793855 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.793716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-oauth-serving-cert\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh" Apr 23 13:36:15.793855 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.793744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24qh\" (UniqueName: \"kubernetes.io/projected/03ac7e4d-338f-421f-9821-c57d4859344b-kube-api-access-g24qh\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh" Apr 23 13:36:15.793855 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.793795 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-service-ca\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh" Apr 23 
13:36:15.793855 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.793811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-trusted-ca-bundle\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.794053 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.793866 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-oauth-config\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.794053 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.793891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-serving-cert\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.895396 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.895350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-oauth-config\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.895396 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.895397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-serving-cert\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.895691 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.895439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-console-config\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.895691 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.895485 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-oauth-serving-cert\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.895691 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.895527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g24qh\" (UniqueName: \"kubernetes.io/projected/03ac7e4d-338f-421f-9821-c57d4859344b-kube-api-access-g24qh\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.895691 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.895568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-service-ca\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.895691 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.895585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-trusted-ca-bundle\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.896342 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.896315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-oauth-serving-cert\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.896461 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.896325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-service-ca\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.896461 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.896387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-console-config\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.896614 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.896471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-trusted-ca-bundle\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.898074 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.898046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-serving-cert\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.898186 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.898056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-oauth-config\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.903595 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.903572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24qh\" (UniqueName: \"kubernetes.io/projected/03ac7e4d-338f-421f-9821-c57d4859344b-kube-api-access-g24qh\") pod \"console-5cd7dc59bf-2hhjh\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:15.996621 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:15.996496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:16.116805 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:16.116774 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cd7dc59bf-2hhjh"]
Apr 23 13:36:16.119898 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:36:16.119867 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ac7e4d_338f_421f_9821_c57d4859344b.slice/crio-32b97b0f5c9d6a59363e608edd9c0ca264f1dd1f08f212e470e2530e6e946453 WatchSource:0}: Error finding container 32b97b0f5c9d6a59363e608edd9c0ca264f1dd1f08f212e470e2530e6e946453: Status 404 returned error can't find the container with id 32b97b0f5c9d6a59363e608edd9c0ca264f1dd1f08f212e470e2530e6e946453
Apr 23 13:36:16.959406 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:16.959369 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cd7dc59bf-2hhjh" event={"ID":"03ac7e4d-338f-421f-9821-c57d4859344b","Type":"ContainerStarted","Data":"0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba"}
Apr 23 13:36:16.959406 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:16.959403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cd7dc59bf-2hhjh" event={"ID":"03ac7e4d-338f-421f-9821-c57d4859344b","Type":"ContainerStarted","Data":"32b97b0f5c9d6a59363e608edd9c0ca264f1dd1f08f212e470e2530e6e946453"}
Apr 23 13:36:16.977589 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:16.977541 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cd7dc59bf-2hhjh" podStartSLOduration=1.977526916 podStartE2EDuration="1.977526916s" podCreationTimestamp="2026-04-23 13:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:36:16.976796695 +0000 UTC m=+243.520540909" watchObservedRunningTime="2026-04-23 13:36:16.977526916 +0000 UTC m=+243.521271128"
Apr 23 13:36:24.777532 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:24.777461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:36:24.779749 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:24.779715 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fde80200-8a4e-4844-91f0-ed8f18a92617-metrics-certs\") pod \"network-metrics-daemon-wzp5m\" (UID: \"fde80200-8a4e-4844-91f0-ed8f18a92617\") " pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:36:24.859984 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:24.859954 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lj9pj\""
Apr 23 13:36:24.867082 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:24.867061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wzp5m"
Apr 23 13:36:24.990949 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:24.990924 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wzp5m"]
Apr 23 13:36:24.992895 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:36:24.992861 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfde80200_8a4e_4844_91f0_ed8f18a92617.slice/crio-8a12940ea975e94702a167f38b40acd1fb3068f223706e2b2b368d6683ea1a92 WatchSource:0}: Error finding container 8a12940ea975e94702a167f38b40acd1fb3068f223706e2b2b368d6683ea1a92: Status 404 returned error can't find the container with id 8a12940ea975e94702a167f38b40acd1fb3068f223706e2b2b368d6683ea1a92
Apr 23 13:36:25.986410 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:25.986381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wzp5m" event={"ID":"fde80200-8a4e-4844-91f0-ed8f18a92617","Type":"ContainerStarted","Data":"8a12940ea975e94702a167f38b40acd1fb3068f223706e2b2b368d6683ea1a92"}
Apr 23 13:36:25.997174 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:25.997148 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:25.997283 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:25.997212 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:26.005155 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:26.005123 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:26.990803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:26.990763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wzp5m" event={"ID":"fde80200-8a4e-4844-91f0-ed8f18a92617","Type":"ContainerStarted","Data":"7de9a5c4de6a1521c55b00cc3fc4e5dc78c4cfee7f7b4c1235f32f828f082425"}
Apr 23 13:36:26.990803 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:26.990807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wzp5m" event={"ID":"fde80200-8a4e-4844-91f0-ed8f18a92617","Type":"ContainerStarted","Data":"1744ee17461d1c605c18b08db7c5ff4ae6c3fa68b6dc73edb60ad094040a13e8"}
Apr 23 13:36:26.994596 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:26.994567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cd7dc59bf-2hhjh"
Apr 23 13:36:27.007804 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:27.007752 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wzp5m" podStartSLOduration=252.098398564 podStartE2EDuration="4m13.007738631s" podCreationTimestamp="2026-04-23 13:32:14 +0000 UTC" firstStartedPulling="2026-04-23 13:36:24.994834727 +0000 UTC m=+251.538578920" lastFinishedPulling="2026-04-23 13:36:25.90417479 +0000 UTC m=+252.447918987" observedRunningTime="2026-04-23 13:36:27.005844955 +0000 UTC m=+253.549589170" watchObservedRunningTime="2026-04-23 13:36:27.007738631 +0000 UTC m=+253.551482896"
Apr 23 13:36:27.054346 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:27.054313 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6996d648f7-r8twh"]
Apr 23 13:36:52.075731 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.075674 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6996d648f7-r8twh" podUID="44205493-6f88-4d5c-8294-79c779b8cd9a" containerName="console" containerID="cri-o://0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801" gracePeriod=15
Apr 23 13:36:52.327934 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.327870 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6996d648f7-r8twh_44205493-6f88-4d5c-8294-79c779b8cd9a/console/0.log"
Apr 23 13:36:52.328047 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.327934 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6996d648f7-r8twh"
Apr 23 13:36:52.412087 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412054 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-trusted-ca-bundle\") pod \"44205493-6f88-4d5c-8294-79c779b8cd9a\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") "
Apr 23 13:36:52.412285 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412098 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmddz\" (UniqueName: \"kubernetes.io/projected/44205493-6f88-4d5c-8294-79c779b8cd9a-kube-api-access-kmddz\") pod \"44205493-6f88-4d5c-8294-79c779b8cd9a\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") "
Apr 23 13:36:52.412285 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412129 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-service-ca\") pod \"44205493-6f88-4d5c-8294-79c779b8cd9a\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") "
Apr 23 13:36:52.412285 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412255 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-oauth-config\") pod \"44205493-6f88-4d5c-8294-79c779b8cd9a\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") "
Apr 23 13:36:52.412461 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412308 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-serving-cert\") pod \"44205493-6f88-4d5c-8294-79c779b8cd9a\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") "
Apr 23 13:36:52.412461 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412348 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-oauth-serving-cert\") pod \"44205493-6f88-4d5c-8294-79c779b8cd9a\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") "
Apr 23 13:36:52.412461 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412397 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-console-config\") pod \"44205493-6f88-4d5c-8294-79c779b8cd9a\" (UID: \"44205493-6f88-4d5c-8294-79c779b8cd9a\") "
Apr 23 13:36:52.412661 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412552 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-service-ca" (OuterVolumeSpecName: "service-ca") pod "44205493-6f88-4d5c-8294-79c779b8cd9a" (UID: "44205493-6f88-4d5c-8294-79c779b8cd9a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:52.412661 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412561 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "44205493-6f88-4d5c-8294-79c779b8cd9a" (UID: "44205493-6f88-4d5c-8294-79c779b8cd9a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:52.412766 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412730 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-trusted-ca-bundle\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:52.412766 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412743 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-service-ca\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:52.412861 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412786 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "44205493-6f88-4d5c-8294-79c779b8cd9a" (UID: "44205493-6f88-4d5c-8294-79c779b8cd9a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:52.412972 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.412930 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-console-config" (OuterVolumeSpecName: "console-config") pod "44205493-6f88-4d5c-8294-79c779b8cd9a" (UID: "44205493-6f88-4d5c-8294-79c779b8cd9a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:36:52.414439 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.414404 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "44205493-6f88-4d5c-8294-79c779b8cd9a" (UID: "44205493-6f88-4d5c-8294-79c779b8cd9a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:52.414545 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.414452 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "44205493-6f88-4d5c-8294-79c779b8cd9a" (UID: "44205493-6f88-4d5c-8294-79c779b8cd9a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:36:52.414545 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.414466 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44205493-6f88-4d5c-8294-79c779b8cd9a-kube-api-access-kmddz" (OuterVolumeSpecName: "kube-api-access-kmddz") pod "44205493-6f88-4d5c-8294-79c779b8cd9a" (UID: "44205493-6f88-4d5c-8294-79c779b8cd9a"). InnerVolumeSpecName "kube-api-access-kmddz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:36:52.513138 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.513084 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmddz\" (UniqueName: \"kubernetes.io/projected/44205493-6f88-4d5c-8294-79c779b8cd9a-kube-api-access-kmddz\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:52.513138 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.513130 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-oauth-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:52.513138 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.513141 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/44205493-6f88-4d5c-8294-79c779b8cd9a-console-serving-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:52.513138 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.513153 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-oauth-serving-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:52.513399 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:52.513162 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/44205493-6f88-4d5c-8294-79c779b8cd9a-console-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:36:53.071265 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.071235 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6996d648f7-r8twh_44205493-6f88-4d5c-8294-79c779b8cd9a/console/0.log"
Apr 23 13:36:53.071451 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.071281 2576 generic.go:358] "Generic (PLEG): container finished" podID="44205493-6f88-4d5c-8294-79c779b8cd9a" containerID="0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801" exitCode=2
Apr 23 13:36:53.071451 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.071347 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6996d648f7-r8twh"
Apr 23 13:36:53.071451 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.071366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6996d648f7-r8twh" event={"ID":"44205493-6f88-4d5c-8294-79c779b8cd9a","Type":"ContainerDied","Data":"0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801"}
Apr 23 13:36:53.071451 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.071398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6996d648f7-r8twh" event={"ID":"44205493-6f88-4d5c-8294-79c779b8cd9a","Type":"ContainerDied","Data":"8322857db29bad9943a14bb36757b9bb1e8c58bac5c2ff32816e795df888d99a"}
Apr 23 13:36:53.071451 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.071415 2576 scope.go:117] "RemoveContainer" containerID="0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801"
Apr 23 13:36:53.080284 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.080110 2576 scope.go:117] "RemoveContainer" containerID="0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801"
Apr 23 13:36:53.080600 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:36:53.080395 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801\": container with ID starting with 0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801 not found: ID does not exist" containerID="0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801"
Apr 23 13:36:53.080600 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.080419 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801"} err="failed to get container status \"0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801\": rpc error: code = NotFound desc = could not find container \"0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801\": container with ID starting with 0b6074b0f21090b46398099cc6be650089a31022e73279051040846cc6c2e801 not found: ID does not exist"
Apr 23 13:36:53.092232 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.092202 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6996d648f7-r8twh"]
Apr 23 13:36:53.098555 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:53.098528 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6996d648f7-r8twh"]
Apr 23 13:36:54.063057 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:36:54.063026 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44205493-6f88-4d5c-8294-79c779b8cd9a" path="/var/lib/kubelet/pods/44205493-6f88-4d5c-8294-79c779b8cd9a/volumes"
Apr 23 13:37:13.933136 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:13.933104 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log"
Apr 23 13:37:13.933764 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:13.933268 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log"
Apr 23 13:37:13.942156 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:13.942134 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 13:37:44.795382 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.795337 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c9b88cbd8-bgvcf"]
Apr 23 13:37:44.795911 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.795749 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44205493-6f88-4d5c-8294-79c779b8cd9a" containerName="console"
Apr 23 13:37:44.795911 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.795765 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="44205493-6f88-4d5c-8294-79c779b8cd9a" containerName="console"
Apr 23 13:37:44.795911 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.795818 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="44205493-6f88-4d5c-8294-79c779b8cd9a" containerName="console"
Apr 23 13:37:44.798660 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.798640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.812752 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.812724 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c9b88cbd8-bgvcf"]
Apr 23 13:37:44.863975 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.863936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-trusted-ca-bundle\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.863975 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.863976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-oauth-config\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.864307 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.863993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f449\" (UniqueName: \"kubernetes.io/projected/e5af15ae-e314-4f48-9b7a-82602b85d57a-kube-api-access-5f449\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.864307 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.864027 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-oauth-serving-cert\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.864307 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.864095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-serving-cert\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.864307 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.864133 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-config\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.864307 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.864224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-service-ca\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.965435 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.965396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-trusted-ca-bundle\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.965435 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.965433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-oauth-config\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.965705 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.965452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f449\" (UniqueName: \"kubernetes.io/projected/e5af15ae-e314-4f48-9b7a-82602b85d57a-kube-api-access-5f449\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.965705 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.965487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-oauth-serving-cert\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.965705 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.965542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-serving-cert\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.965705 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.965573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-config\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.965705 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.965628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-service-ca\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.966356 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.966330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-config\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.966456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.966363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-oauth-serving-cert\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.966456 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.966425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-service-ca\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.966561 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.966425 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-trusted-ca-bundle\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.968474 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.968453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-oauth-config\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.968585 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.968453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-serving-cert\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:44.973604 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:44.973577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f449\" (UniqueName: \"kubernetes.io/projected/e5af15ae-e314-4f48-9b7a-82602b85d57a-kube-api-access-5f449\") pod \"console-6c9b88cbd8-bgvcf\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") " pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:45.109422 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:45.109335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:45.230166 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:45.230003 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c9b88cbd8-bgvcf"]
Apr 23 13:37:45.232876 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:37:45.232846 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5af15ae_e314_4f48_9b7a_82602b85d57a.slice/crio-43a0b9b1a61c160435f3017065c677d7cf1e921930b4960c80412d928b44d32c WatchSource:0}: Error finding container 43a0b9b1a61c160435f3017065c677d7cf1e921930b4960c80412d928b44d32c: Status 404 returned error can't find the container with id 43a0b9b1a61c160435f3017065c677d7cf1e921930b4960c80412d928b44d32c
Apr 23 13:37:45.234574 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:45.234558 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:37:46.232698 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:46.232662 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c9b88cbd8-bgvcf" event={"ID":"e5af15ae-e314-4f48-9b7a-82602b85d57a","Type":"ContainerStarted","Data":"90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd"}
Apr 23 13:37:46.233109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:46.232698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c9b88cbd8-bgvcf" event={"ID":"e5af15ae-e314-4f48-9b7a-82602b85d57a","Type":"ContainerStarted","Data":"43a0b9b1a61c160435f3017065c677d7cf1e921930b4960c80412d928b44d32c"}
Apr 23 13:37:46.254065 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:46.254017 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c9b88cbd8-bgvcf" podStartSLOduration=2.254002753 podStartE2EDuration="2.254002753s" podCreationTimestamp="2026-04-23 13:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:37:46.252203582 +0000 UTC m=+332.795947797" watchObservedRunningTime="2026-04-23 13:37:46.254002753 +0000 UTC m=+332.797746967"
Apr 23 13:37:55.110046 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:55.110005 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:55.110046 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:55.110052 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:55.114891 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:55.114867 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:55.264232 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:55.264199 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:37:55.311800 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:37:55.311765 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cd7dc59bf-2hhjh"]
Apr 23 13:38:19.101645 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.101609 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-v6hlj"]
Apr 23 13:38:19.105051 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.105032 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.108655 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.108634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 13:38:19.114657 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.114632 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-v6hlj"] Apr 23 13:38:19.266324 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.266286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-kubelet-config\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.266490 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.266338 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-original-pull-secret\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.266490 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.266399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-dbus\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.367855 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.367758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-dbus\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.368036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.367870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-kubelet-config\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.368036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.367912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-original-pull-secret\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.368036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.367951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-dbus\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.368036 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.367986 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-kubelet-config\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.370400 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.370375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/db5c8cc0-562a-4b09-9003-11f2a78bd2a6-original-pull-secret\") pod \"global-pull-secret-syncer-v6hlj\" (UID: \"db5c8cc0-562a-4b09-9003-11f2a78bd2a6\") " pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.414225 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.414189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hlj" Apr 23 13:38:19.533587 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:19.533404 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-v6hlj"] Apr 23 13:38:19.536309 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:38:19.536284 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb5c8cc0_562a_4b09_9003_11f2a78bd2a6.slice/crio-b9ff0a3ff9c327ebef36c5021d38e55e60f5d630e4c91a88ca781faf5f694866 WatchSource:0}: Error finding container b9ff0a3ff9c327ebef36c5021d38e55e60f5d630e4c91a88ca781faf5f694866: Status 404 returned error can't find the container with id b9ff0a3ff9c327ebef36c5021d38e55e60f5d630e4c91a88ca781faf5f694866 Apr 23 13:38:20.332696 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.332638 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5cd7dc59bf-2hhjh" podUID="03ac7e4d-338f-421f-9821-c57d4859344b" containerName="console" containerID="cri-o://0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba" gracePeriod=15 Apr 23 13:38:20.337829 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.337782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-v6hlj" event={"ID":"db5c8cc0-562a-4b09-9003-11f2a78bd2a6","Type":"ContainerStarted","Data":"b9ff0a3ff9c327ebef36c5021d38e55e60f5d630e4c91a88ca781faf5f694866"} Apr 23 13:38:20.591294 ip-10-0-137-177 kubenswrapper[2576]: 
I0423 13:38:20.591223 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cd7dc59bf-2hhjh_03ac7e4d-338f-421f-9821-c57d4859344b/console/0.log" Apr 23 13:38:20.591432 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.591296 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cd7dc59bf-2hhjh" Apr 23 13:38:20.681362 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.681320 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g24qh\" (UniqueName: \"kubernetes.io/projected/03ac7e4d-338f-421f-9821-c57d4859344b-kube-api-access-g24qh\") pod \"03ac7e4d-338f-421f-9821-c57d4859344b\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " Apr 23 13:38:20.681577 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.681414 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-console-config\") pod \"03ac7e4d-338f-421f-9821-c57d4859344b\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " Apr 23 13:38:20.681577 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.681449 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-service-ca\") pod \"03ac7e4d-338f-421f-9821-c57d4859344b\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " Apr 23 13:38:20.681577 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.681491 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-trusted-ca-bundle\") pod \"03ac7e4d-338f-421f-9821-c57d4859344b\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " Apr 23 13:38:20.681577 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.681551 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-oauth-serving-cert\") pod \"03ac7e4d-338f-421f-9821-c57d4859344b\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " Apr 23 13:38:20.681785 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.681582 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-oauth-config\") pod \"03ac7e4d-338f-421f-9821-c57d4859344b\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " Apr 23 13:38:20.681785 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.681615 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-serving-cert\") pod \"03ac7e4d-338f-421f-9821-c57d4859344b\" (UID: \"03ac7e4d-338f-421f-9821-c57d4859344b\") " Apr 23 13:38:20.682369 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.682328 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "03ac7e4d-338f-421f-9821-c57d4859344b" (UID: "03ac7e4d-338f-421f-9821-c57d4859344b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:38:20.682369 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.682339 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-console-config" (OuterVolumeSpecName: "console-config") pod "03ac7e4d-338f-421f-9821-c57d4859344b" (UID: "03ac7e4d-338f-421f-9821-c57d4859344b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:38:20.682575 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.682348 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "03ac7e4d-338f-421f-9821-c57d4859344b" (UID: "03ac7e4d-338f-421f-9821-c57d4859344b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:38:20.682641 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.682617 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-service-ca" (OuterVolumeSpecName: "service-ca") pod "03ac7e4d-338f-421f-9821-c57d4859344b" (UID: "03ac7e4d-338f-421f-9821-c57d4859344b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:38:20.684332 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.684308 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ac7e4d-338f-421f-9821-c57d4859344b-kube-api-access-g24qh" (OuterVolumeSpecName: "kube-api-access-g24qh") pod "03ac7e4d-338f-421f-9821-c57d4859344b" (UID: "03ac7e4d-338f-421f-9821-c57d4859344b"). InnerVolumeSpecName "kube-api-access-g24qh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:38:20.684491 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.684467 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "03ac7e4d-338f-421f-9821-c57d4859344b" (UID: "03ac7e4d-338f-421f-9821-c57d4859344b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:38:20.684740 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.684709 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "03ac7e4d-338f-421f-9821-c57d4859344b" (UID: "03ac7e4d-338f-421f-9821-c57d4859344b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:38:20.783301 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.783265 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-trusted-ca-bundle\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:38:20.783301 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.783302 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-oauth-serving-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:38:20.783497 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.783316 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-oauth-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:38:20.783497 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.783329 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/03ac7e4d-338f-421f-9821-c57d4859344b-console-serving-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:38:20.783497 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.783341 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g24qh\" (UniqueName: 
\"kubernetes.io/projected/03ac7e4d-338f-421f-9821-c57d4859344b-kube-api-access-g24qh\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:38:20.783497 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.783353 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-console-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:38:20.783497 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:20.783367 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/03ac7e4d-338f-421f-9821-c57d4859344b-service-ca\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:38:21.342569 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.342537 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cd7dc59bf-2hhjh_03ac7e4d-338f-421f-9821-c57d4859344b/console/0.log" Apr 23 13:38:21.342984 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.342584 2576 generic.go:358] "Generic (PLEG): container finished" podID="03ac7e4d-338f-421f-9821-c57d4859344b" containerID="0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba" exitCode=2 Apr 23 13:38:21.342984 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.342669 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5cd7dc59bf-2hhjh" Apr 23 13:38:21.342984 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.342684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cd7dc59bf-2hhjh" event={"ID":"03ac7e4d-338f-421f-9821-c57d4859344b","Type":"ContainerDied","Data":"0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba"} Apr 23 13:38:21.342984 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.342728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cd7dc59bf-2hhjh" event={"ID":"03ac7e4d-338f-421f-9821-c57d4859344b","Type":"ContainerDied","Data":"32b97b0f5c9d6a59363e608edd9c0ca264f1dd1f08f212e470e2530e6e946453"} Apr 23 13:38:21.342984 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.342744 2576 scope.go:117] "RemoveContainer" containerID="0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba" Apr 23 13:38:21.352295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.352276 2576 scope.go:117] "RemoveContainer" containerID="0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba" Apr 23 13:38:21.352680 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:38:21.352648 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba\": container with ID starting with 0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba not found: ID does not exist" containerID="0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba" Apr 23 13:38:21.352818 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.352684 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba"} err="failed to get container status \"0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba\": rpc error: code = 
NotFound desc = could not find container \"0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba\": container with ID starting with 0d155f8c32e3bf6abc63f4d2713650b09f6ab71b4e5b528207f5471bcb6c02ba not found: ID does not exist" Apr 23 13:38:21.368233 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.368206 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cd7dc59bf-2hhjh"] Apr 23 13:38:21.371188 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:21.371167 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5cd7dc59bf-2hhjh"] Apr 23 13:38:22.061577 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:22.061544 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ac7e4d-338f-421f-9821-c57d4859344b" path="/var/lib/kubelet/pods/03ac7e4d-338f-421f-9821-c57d4859344b/volumes" Apr 23 13:38:24.356455 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:24.356420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-v6hlj" event={"ID":"db5c8cc0-562a-4b09-9003-11f2a78bd2a6","Type":"ContainerStarted","Data":"705ba330b55a53f52ca536fa638807f657da0cfc44cb6ecfdd675870ff65d98b"} Apr 23 13:38:24.386747 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:24.386698 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-v6hlj" podStartSLOduration=1.423308602 podStartE2EDuration="5.386682702s" podCreationTimestamp="2026-04-23 13:38:19 +0000 UTC" firstStartedPulling="2026-04-23 13:38:19.53791584 +0000 UTC m=+366.081660033" lastFinishedPulling="2026-04-23 13:38:23.501289926 +0000 UTC m=+370.045034133" observedRunningTime="2026-04-23 13:38:24.385792473 +0000 UTC m=+370.929536687" watchObservedRunningTime="2026-04-23 13:38:24.386682702 +0000 UTC m=+370.930426917" Apr 23 13:38:40.975163 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.975127 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz"] Apr 23 13:38:40.975573 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.975476 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03ac7e4d-338f-421f-9821-c57d4859344b" containerName="console" Apr 23 13:38:40.975573 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.975486 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ac7e4d-338f-421f-9821-c57d4859344b" containerName="console" Apr 23 13:38:40.975573 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.975572 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="03ac7e4d-338f-421f-9821-c57d4859344b" containerName="console" Apr 23 13:38:40.978801 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.978785 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:40.982105 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.982086 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-qvnc9\"" Apr 23 13:38:40.982452 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.982433 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 23 13:38:40.983223 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.983206 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 23 13:38:40.987729 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:40.987703 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz"] Apr 23 13:38:41.063139 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.063103 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:41.063309 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.063156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn5hd\" (UniqueName: \"kubernetes.io/projected/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-kube-api-access-jn5hd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:41.063309 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.063236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:41.163985 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.163944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:41.164165 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.164022 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jn5hd\" (UniqueName: \"kubernetes.io/projected/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-kube-api-access-jn5hd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:41.164165 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.164079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:41.164438 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.164416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:41.164438 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.164431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" Apr 23 13:38:41.173764 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.173738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn5hd\" (UniqueName: 
\"kubernetes.io/projected/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-kube-api-access-jn5hd\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz"
Apr 23 13:38:41.287994 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.287913 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz"
Apr 23 13:38:41.414146 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:41.414116 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz"]
Apr 23 13:38:41.416989 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:38:41.416958 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eba0a3d_eabc_4dfc_8cb9_3491c9835c2e.slice/crio-f61a96bbb3914e03176b19b1c3c3c7a17862287dcd64078070ab65f37f1cae3c WatchSource:0}: Error finding container f61a96bbb3914e03176b19b1c3c3c7a17862287dcd64078070ab65f37f1cae3c: Status 404 returned error can't find the container with id f61a96bbb3914e03176b19b1c3c3c7a17862287dcd64078070ab65f37f1cae3c
Apr 23 13:38:42.417396 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:42.417290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" event={"ID":"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e","Type":"ContainerStarted","Data":"f61a96bbb3914e03176b19b1c3c3c7a17862287dcd64078070ab65f37f1cae3c"}
Apr 23 13:38:47.435440 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:47.435405 2576 generic.go:358] "Generic (PLEG): container finished" podID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerID="6829a756f08af8ceeb46da37a253b691d8b972818120e5c553a6d92ced38a967" exitCode=0
Apr 23 13:38:47.435876 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:47.435487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" event={"ID":"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e","Type":"ContainerDied","Data":"6829a756f08af8ceeb46da37a253b691d8b972818120e5c553a6d92ced38a967"}
Apr 23 13:38:50.446404 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:50.446366 2576 generic.go:358] "Generic (PLEG): container finished" podID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerID="e11b37f0167b96a28f53ad22f3515ab657038711da2b1ccbe43246a63d7b9e6d" exitCode=0
Apr 23 13:38:50.446874 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:50.446435 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" event={"ID":"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e","Type":"ContainerDied","Data":"e11b37f0167b96a28f53ad22f3515ab657038711da2b1ccbe43246a63d7b9e6d"}
Apr 23 13:38:56.469105 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:56.469067 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" event={"ID":"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e","Type":"ContainerStarted","Data":"ac3ba88db8c6693d4f746e97f78b9c91800dd832f7643cb9f46c16ce6ee12acd"}
Apr 23 13:38:56.487778 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:56.487717 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" podStartSLOduration=1.588015253 podStartE2EDuration="16.48769881s" podCreationTimestamp="2026-04-23 13:38:40 +0000 UTC" firstStartedPulling="2026-04-23 13:38:41.418727461 +0000 UTC m=+387.962471653" lastFinishedPulling="2026-04-23 13:38:56.318411006 +0000 UTC m=+402.862155210" observedRunningTime="2026-04-23 13:38:56.485770722 +0000 UTC m=+403.029514936" watchObservedRunningTime="2026-04-23 13:38:56.48769881 +0000 UTC m=+403.031443028"
Apr 23 13:38:57.479124 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:57.479091 2576 generic.go:358] "Generic (PLEG): container finished" podID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerID="ac3ba88db8c6693d4f746e97f78b9c91800dd832f7643cb9f46c16ce6ee12acd" exitCode=0
Apr 23 13:38:57.479535 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:57.479192 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" event={"ID":"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e","Type":"ContainerDied","Data":"ac3ba88db8c6693d4f746e97f78b9c91800dd832f7643cb9f46c16ce6ee12acd"}
Apr 23 13:38:58.612092 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.612068 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz"
Apr 23 13:38:58.727567 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.727528 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-bundle\") pod \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") "
Apr 23 13:38:58.727745 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.727579 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-util\") pod \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") "
Apr 23 13:38:58.727745 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.727658 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn5hd\" (UniqueName: \"kubernetes.io/projected/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-kube-api-access-jn5hd\") pod \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\" (UID: \"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e\") "
Apr 23 13:38:58.728111 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.728088 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-bundle" (OuterVolumeSpecName: "bundle") pod "6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" (UID: "6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:38:58.729830 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.729799 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-kube-api-access-jn5hd" (OuterVolumeSpecName: "kube-api-access-jn5hd") pod "6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" (UID: "6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e"). InnerVolumeSpecName "kube-api-access-jn5hd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:38:58.731544 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.731523 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-util" (OuterVolumeSpecName: "util") pod "6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" (UID: "6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:38:58.829001 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.828962 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jn5hd\" (UniqueName: \"kubernetes.io/projected/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-kube-api-access-jn5hd\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:38:58.829001 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.828998 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-bundle\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:38:58.829001 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:58.829008 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e-util\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:38:59.486997 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:59.486970 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz"
Apr 23 13:38:59.487143 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:59.486969 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c445xz" event={"ID":"6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e","Type":"ContainerDied","Data":"f61a96bbb3914e03176b19b1c3c3c7a17862287dcd64078070ab65f37f1cae3c"}
Apr 23 13:38:59.487143 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:38:59.487081 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f61a96bbb3914e03176b19b1c3c3c7a17862287dcd64078070ab65f37f1cae3c"
Apr 23 13:39:07.404349 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.404316 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-d796l"]
Apr 23 13:39:07.404768 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.404660 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerName="pull"
Apr 23 13:39:07.404768 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.404671 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerName="pull"
Apr 23 13:39:07.404768 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.404686 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerName="util"
Apr 23 13:39:07.404768 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.404693 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerName="util"
Apr 23 13:39:07.404768 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.404706 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerName="extract"
Apr 23 13:39:07.404768 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.404712 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerName="extract"
Apr 23 13:39:07.404768 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.404764 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6eba0a3d-eabc-4dfc-8cb9-3491c9835c2e" containerName="extract"
Apr 23 13:39:07.408093 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.408079 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.411206 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.411184 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 23 13:39:07.412145 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.412123 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 23 13:39:07.412264 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.412175 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 23 13:39:07.412264 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.412126 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 23 13:39:07.412264 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.412130 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 23 13:39:07.412392 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.412131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-xtz6q\""
Apr 23 13:39:07.418833 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.418813 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-d796l"]
Apr 23 13:39:07.500226 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.500184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.500226 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.500235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dd0325e5-5b61-43c7-930d-7380f8740cd9-cabundle0\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.500453 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.500263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk4hj\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-kube-api-access-nk4hj\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.601040 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.601002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dd0325e5-5b61-43c7-930d-7380f8740cd9-cabundle0\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.601040 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.601042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nk4hj\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-kube-api-access-nk4hj\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.601253 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.601107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.601253 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.601202 2576 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 23 13:39:07.601253 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.601227 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:39:07.601253 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.601234 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:39:07.601253 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.601246 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-d796l: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 23 13:39:07.601413 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.601301 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates podName:dd0325e5-5b61-43c7-930d-7380f8740cd9 nodeName:}" failed. No retries permitted until 2026-04-23 13:39:08.10128658 +0000 UTC m=+414.645030773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates") pod "keda-operator-ffbb595cb-d796l" (UID: "dd0325e5-5b61-43c7-930d-7380f8740cd9") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 23 13:39:07.601662 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.601643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/dd0325e5-5b61-43c7-930d-7380f8740cd9-cabundle0\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.611280 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.611248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk4hj\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-kube-api-access-nk4hj\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:07.722253 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.722168 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"]
Apr 23 13:39:07.725466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.725448 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.728364 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.728341 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 23 13:39:07.742732 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.742702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"]
Apr 23 13:39:07.802671 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.802637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6413eea6-cd29-4f1e-a91f-83c778171097-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.802854 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.802685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxxfs\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-kube-api-access-cxxfs\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.802854 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.802768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.903997 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.903962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6413eea6-cd29-4f1e-a91f-83c778171097-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.904144 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.904027 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxxfs\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-kube-api-access-cxxfs\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.904144 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.904138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.904308 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.904290 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 23 13:39:07.904352 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.904314 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 13:39:07.904352 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.904337 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws: references non-existent secret key: tls.crt
Apr 23 13:39:07.904425 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.904363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/6413eea6-cd29-4f1e-a91f-83c778171097-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.904425 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:07.904404 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates podName:6413eea6-cd29-4f1e-a91f-83c778171097 nodeName:}" failed. No retries permitted until 2026-04-23 13:39:08.404386503 +0000 UTC m=+414.948130709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates") pod "keda-metrics-apiserver-7c9f485588-qnfws" (UID: "6413eea6-cd29-4f1e-a91f-83c778171097") : references non-existent secret key: tls.crt
Apr 23 13:39:07.916215 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.916185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxxfs\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-kube-api-access-cxxfs\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:07.953260 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.953229 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-ln8tk"]
Apr 23 13:39:07.956609 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.956593 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:07.959365 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.959340 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 23 13:39:07.974143 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:07.974076 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-ln8tk"]
Apr 23 13:39:08.105500 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.105469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b400426c-c885-40d5-ace6-31b256e96541-certificates\") pod \"keda-admission-cf49989db-ln8tk\" (UID: \"b400426c-c885-40d5-ace6-31b256e96541\") " pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:08.105674 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.105525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntph\" (UniqueName: \"kubernetes.io/projected/b400426c-c885-40d5-ace6-31b256e96541-kube-api-access-gntph\") pod \"keda-admission-cf49989db-ln8tk\" (UID: \"b400426c-c885-40d5-ace6-31b256e96541\") " pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:08.105674 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.105617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:08.105766 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:08.105750 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:39:08.105801 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:08.105770 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:39:08.105801 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:08.105779 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-d796l: references non-existent secret key: ca.crt
Apr 23 13:39:08.105860 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:08.105831 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates podName:dd0325e5-5b61-43c7-930d-7380f8740cd9 nodeName:}" failed. No retries permitted until 2026-04-23 13:39:09.105814778 +0000 UTC m=+415.649558983 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates") pod "keda-operator-ffbb595cb-d796l" (UID: "dd0325e5-5b61-43c7-930d-7380f8740cd9") : references non-existent secret key: ca.crt
Apr 23 13:39:08.206218 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.206179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b400426c-c885-40d5-ace6-31b256e96541-certificates\") pod \"keda-admission-cf49989db-ln8tk\" (UID: \"b400426c-c885-40d5-ace6-31b256e96541\") " pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:08.206409 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.206354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gntph\" (UniqueName: \"kubernetes.io/projected/b400426c-c885-40d5-ace6-31b256e96541-kube-api-access-gntph\") pod \"keda-admission-cf49989db-ln8tk\" (UID: \"b400426c-c885-40d5-ace6-31b256e96541\") " pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:08.208785 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.208763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b400426c-c885-40d5-ace6-31b256e96541-certificates\") pod \"keda-admission-cf49989db-ln8tk\" (UID: \"b400426c-c885-40d5-ace6-31b256e96541\") " pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:08.224253 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.224184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntph\" (UniqueName: \"kubernetes.io/projected/b400426c-c885-40d5-ace6-31b256e96541-kube-api-access-gntph\") pod \"keda-admission-cf49989db-ln8tk\" (UID: \"b400426c-c885-40d5-ace6-31b256e96541\") " pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:08.267277 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.267237 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:08.399822 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.399794 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-ln8tk"]
Apr 23 13:39:08.401876 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:39:08.401840 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb400426c_c885_40d5_ace6_31b256e96541.slice/crio-95d713de4e5e9ebefce0de3a0619b9f0ba59ffe5f91c74b57e7643327a8e4609 WatchSource:0}: Error finding container 95d713de4e5e9ebefce0de3a0619b9f0ba59ffe5f91c74b57e7643327a8e4609: Status 404 returned error can't find the container with id 95d713de4e5e9ebefce0de3a0619b9f0ba59ffe5f91c74b57e7643327a8e4609
Apr 23 13:39:08.408092 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.408068 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:08.408484 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:08.408230 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 23 13:39:08.408484 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:08.408248 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 13:39:08.408484 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:08.408271 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws: references non-existent secret key: tls.crt
Apr 23 13:39:08.408484 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:08.408332 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates podName:6413eea6-cd29-4f1e-a91f-83c778171097 nodeName:}" failed. No retries permitted until 2026-04-23 13:39:09.40831345 +0000 UTC m=+415.952057642 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates") pod "keda-metrics-apiserver-7c9f485588-qnfws" (UID: "6413eea6-cd29-4f1e-a91f-83c778171097") : references non-existent secret key: tls.crt
Apr 23 13:39:08.514718 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:08.514630 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-ln8tk" event={"ID":"b400426c-c885-40d5-ace6-31b256e96541","Type":"ContainerStarted","Data":"95d713de4e5e9ebefce0de3a0619b9f0ba59ffe5f91c74b57e7643327a8e4609"}
Apr 23 13:39:09.115670 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:09.115637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:09.115866 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:09.115802 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:39:09.115866 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:09.115827 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:39:09.115866 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:09.115837 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-d796l: references non-existent secret key: ca.crt
Apr 23 13:39:09.116016 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:09.115889 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates podName:dd0325e5-5b61-43c7-930d-7380f8740cd9 nodeName:}" failed. No retries permitted until 2026-04-23 13:39:11.115875297 +0000 UTC m=+417.659619490 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates") pod "keda-operator-ffbb595cb-d796l" (UID: "dd0325e5-5b61-43c7-930d-7380f8740cd9") : references non-existent secret key: ca.crt
Apr 23 13:39:09.418697 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:09.418607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:09.419146 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:09.418775 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 23 13:39:09.419146 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:09.418803 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 23 13:39:09.419146 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:09.418827 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws: references non-existent secret key: tls.crt
Apr 23 13:39:09.419146 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:09.418897 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates podName:6413eea6-cd29-4f1e-a91f-83c778171097 nodeName:}" failed. No retries permitted until 2026-04-23 13:39:11.418878709 +0000 UTC m=+417.962622901 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates") pod "keda-metrics-apiserver-7c9f485588-qnfws" (UID: "6413eea6-cd29-4f1e-a91f-83c778171097") : references non-existent secret key: tls.crt
Apr 23 13:39:10.523076 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:10.523043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-ln8tk" event={"ID":"b400426c-c885-40d5-ace6-31b256e96541","Type":"ContainerStarted","Data":"06f7ab7de6dfd19589bdbf5cad093136ae25cf01e6172ba5b452e306b3967770"}
Apr 23 13:39:10.523447 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:10.523196 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-ln8tk"
Apr 23 13:39:10.540458 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:10.540402 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-ln8tk" podStartSLOduration=2.12289612 podStartE2EDuration="3.540389082s" podCreationTimestamp="2026-04-23 13:39:07 +0000 UTC" firstStartedPulling="2026-04-23 13:39:08.403122286 +0000 UTC m=+414.946866478" lastFinishedPulling="2026-04-23 13:39:09.820615247 +0000 UTC m=+416.364359440" observedRunningTime="2026-04-23 13:39:10.537957526 +0000 UTC m=+417.081701743" watchObservedRunningTime="2026-04-23 13:39:10.540389082 +0000 UTC m=+417.084133296"
Apr 23 13:39:11.135266 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:11.135222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:11.135423 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:11.135356 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 23 13:39:11.135423 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:11.135370 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 23 13:39:11.135423 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:11.135378 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-d796l: references non-existent secret key: ca.crt
Apr 23 13:39:11.135570 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:39:11.135452 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates podName:dd0325e5-5b61-43c7-930d-7380f8740cd9 nodeName:}" failed. No retries permitted until 2026-04-23 13:39:15.135435171 +0000 UTC m=+421.679179364 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates") pod "keda-operator-ffbb595cb-d796l" (UID: "dd0325e5-5b61-43c7-930d-7380f8740cd9") : references non-existent secret key: ca.crt
Apr 23 13:39:11.438320 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:11.438229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:11.440739 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:11.440712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/6413eea6-cd29-4f1e-a91f-83c778171097-certificates\") pod \"keda-metrics-apiserver-7c9f485588-qnfws\" (UID: \"6413eea6-cd29-4f1e-a91f-83c778171097\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:11.636919 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:11.636882 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"
Apr 23 13:39:11.756127 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:11.756098 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws"]
Apr 23 13:39:11.758553 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:39:11.758527 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6413eea6_cd29_4f1e_a91f_83c778171097.slice/crio-2cff4a8ce140ac3276164de34015bd9efa60a0fcd0f2f45568bef83889100890 WatchSource:0}: Error finding container 2cff4a8ce140ac3276164de34015bd9efa60a0fcd0f2f45568bef83889100890: Status 404 returned error can't find the container with id 2cff4a8ce140ac3276164de34015bd9efa60a0fcd0f2f45568bef83889100890
Apr 23 13:39:12.530109 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:12.530075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws" event={"ID":"6413eea6-cd29-4f1e-a91f-83c778171097","Type":"ContainerStarted","Data":"2cff4a8ce140ac3276164de34015bd9efa60a0fcd0f2f45568bef83889100890"}
Apr 23 13:39:15.177402 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:15.177357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l"
Apr 23 13:39:15.179794 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:15.179769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName:
\"kubernetes.io/projected/dd0325e5-5b61-43c7-930d-7380f8740cd9-certificates\") pod \"keda-operator-ffbb595cb-d796l\" (UID: \"dd0325e5-5b61-43c7-930d-7380f8740cd9\") " pod="openshift-keda/keda-operator-ffbb595cb-d796l" Apr 23 13:39:15.219986 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:15.219913 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-d796l" Apr 23 13:39:15.346730 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:15.346701 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-d796l"] Apr 23 13:39:15.349228 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:39:15.349192 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0325e5_5b61_43c7_930d_7380f8740cd9.slice/crio-433f5e6732554ac7185626d5023c62c5305d5dc7485987ea67717dbcde743002 WatchSource:0}: Error finding container 433f5e6732554ac7185626d5023c62c5305d5dc7485987ea67717dbcde743002: Status 404 returned error can't find the container with id 433f5e6732554ac7185626d5023c62c5305d5dc7485987ea67717dbcde743002 Apr 23 13:39:15.540734 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:15.540633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws" event={"ID":"6413eea6-cd29-4f1e-a91f-83c778171097","Type":"ContainerStarted","Data":"648b6b7e05fc42fb426bfe13a80605b6cb294ae2798518e56bcd2a07ec0160ce"} Apr 23 13:39:15.540909 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:15.540825 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws" Apr 23 13:39:15.541711 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:15.541683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-d796l" 
event={"ID":"dd0325e5-5b61-43c7-930d-7380f8740cd9","Type":"ContainerStarted","Data":"433f5e6732554ac7185626d5023c62c5305d5dc7485987ea67717dbcde743002"} Apr 23 13:39:15.559174 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:15.559108 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws" podStartSLOduration=5.3579327 podStartE2EDuration="8.559096657s" podCreationTimestamp="2026-04-23 13:39:07 +0000 UTC" firstStartedPulling="2026-04-23 13:39:11.759919576 +0000 UTC m=+418.303663771" lastFinishedPulling="2026-04-23 13:39:14.961083521 +0000 UTC m=+421.504827728" observedRunningTime="2026-04-23 13:39:15.557263635 +0000 UTC m=+422.101007850" watchObservedRunningTime="2026-04-23 13:39:15.559096657 +0000 UTC m=+422.102840866" Apr 23 13:39:19.560518 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:19.560466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-d796l" event={"ID":"dd0325e5-5b61-43c7-930d-7380f8740cd9","Type":"ContainerStarted","Data":"635f7ddd0140d4394804f2e6da1109cef851c79e92b78882de88766179aded5c"} Apr 23 13:39:19.560921 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:19.560542 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-d796l" Apr 23 13:39:19.576643 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:19.576585 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-d796l" podStartSLOduration=8.634170432 podStartE2EDuration="12.576569514s" podCreationTimestamp="2026-04-23 13:39:07 +0000 UTC" firstStartedPulling="2026-04-23 13:39:15.350923923 +0000 UTC m=+421.894668115" lastFinishedPulling="2026-04-23 13:39:19.293323004 +0000 UTC m=+425.837067197" observedRunningTime="2026-04-23 13:39:19.576043824 +0000 UTC m=+426.119788057" watchObservedRunningTime="2026-04-23 13:39:19.576569514 +0000 UTC 
m=+426.120313731" Apr 23 13:39:26.551018 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:26.550935 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-qnfws" Apr 23 13:39:31.528309 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:31.528274 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-ln8tk" Apr 23 13:39:40.565730 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:39:40.565701 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-d796l" Apr 23 13:40:14.193295 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.193257 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vhxx9"] Apr 23 13:40:14.196781 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.196763 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:14.199956 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.199939 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 13:40:14.199956 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.199949 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:40:14.200083 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.199975 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-nwr6l\"" Apr 23 13:40:14.201219 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.201195 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:40:14.207654 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.207634 2576 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vhxx9"] Apr 23 13:40:14.309236 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.309204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert\") pod \"kserve-controller-manager-6b667fdd66-vhxx9\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:14.309403 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.309245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xtrt\" (UniqueName: \"kubernetes.io/projected/7d15459d-dc98-43f3-87f2-b6938334c45b-kube-api-access-5xtrt\") pod \"kserve-controller-manager-6b667fdd66-vhxx9\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:14.410782 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.410747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert\") pod \"kserve-controller-manager-6b667fdd66-vhxx9\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:14.410980 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.410843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xtrt\" (UniqueName: \"kubernetes.io/projected/7d15459d-dc98-43f3-87f2-b6938334c45b-kube-api-access-5xtrt\") pod \"kserve-controller-manager-6b667fdd66-vhxx9\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:14.410980 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:40:14.410923 2576 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret 
"kserve-webhook-server-cert" not found Apr 23 13:40:14.411093 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:40:14.411001 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert podName:7d15459d-dc98-43f3-87f2-b6938334c45b nodeName:}" failed. No retries permitted until 2026-04-23 13:40:14.910978479 +0000 UTC m=+481.454722672 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert") pod "kserve-controller-manager-6b667fdd66-vhxx9" (UID: "7d15459d-dc98-43f3-87f2-b6938334c45b") : secret "kserve-webhook-server-cert" not found Apr 23 13:40:14.419905 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.419880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xtrt\" (UniqueName: \"kubernetes.io/projected/7d15459d-dc98-43f3-87f2-b6938334c45b-kube-api-access-5xtrt\") pod \"kserve-controller-manager-6b667fdd66-vhxx9\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:14.916070 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.916034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert\") pod \"kserve-controller-manager-6b667fdd66-vhxx9\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:14.918537 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:14.918494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert\") pod \"kserve-controller-manager-6b667fdd66-vhxx9\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:15.106929 ip-10-0-137-177 
kubenswrapper[2576]: I0423 13:40:15.106895 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:15.231456 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:40:15.231426 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d15459d_dc98_43f3_87f2_b6938334c45b.slice/crio-caecc9eb724c30a86154a30d1707d900efcbcaa45607adb9055ad66cb3559cc7 WatchSource:0}: Error finding container caecc9eb724c30a86154a30d1707d900efcbcaa45607adb9055ad66cb3559cc7: Status 404 returned error can't find the container with id caecc9eb724c30a86154a30d1707d900efcbcaa45607adb9055ad66cb3559cc7 Apr 23 13:40:15.231796 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:15.231548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vhxx9"] Apr 23 13:40:15.746241 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:15.746208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" event={"ID":"7d15459d-dc98-43f3-87f2-b6938334c45b","Type":"ContainerStarted","Data":"caecc9eb724c30a86154a30d1707d900efcbcaa45607adb9055ad66cb3559cc7"} Apr 23 13:40:18.757071 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:18.757028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" event={"ID":"7d15459d-dc98-43f3-87f2-b6938334c45b","Type":"ContainerStarted","Data":"963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b"} Apr 23 13:40:18.757452 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:18.757160 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:18.774492 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:18.774442 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" podStartSLOduration=2.227831553 podStartE2EDuration="4.774427176s" podCreationTimestamp="2026-04-23 13:40:14 +0000 UTC" firstStartedPulling="2026-04-23 13:40:15.232875437 +0000 UTC m=+481.776619629" lastFinishedPulling="2026-04-23 13:40:17.779471045 +0000 UTC m=+484.323215252" observedRunningTime="2026-04-23 13:40:18.773711009 +0000 UTC m=+485.317455233" watchObservedRunningTime="2026-04-23 13:40:18.774427176 +0000 UTC m=+485.318171392" Apr 23 13:40:49.766941 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:49.766910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:51.173314 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.173217 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vhxx9"] Apr 23 13:40:51.173727 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.173451 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" podUID="7d15459d-dc98-43f3-87f2-b6938334c45b" containerName="manager" containerID="cri-o://963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b" gracePeriod=10 Apr 23 13:40:51.198772 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.198745 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-qjtj6"] Apr 23 13:40:51.204471 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.204453 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:51.208993 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.208967 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-qjtj6"] Apr 23 13:40:51.238548 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.238494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4zv6\" (UniqueName: \"kubernetes.io/projected/8c57b924-7f86-49fd-badd-28b68c92f31c-kube-api-access-z4zv6\") pod \"kserve-controller-manager-6b667fdd66-qjtj6\" (UID: \"8c57b924-7f86-49fd-badd-28b68c92f31c\") " pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:51.238715 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.238624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c57b924-7f86-49fd-badd-28b68c92f31c-cert\") pod \"kserve-controller-manager-6b667fdd66-qjtj6\" (UID: \"8c57b924-7f86-49fd-badd-28b68c92f31c\") " pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:51.339368 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.339333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4zv6\" (UniqueName: \"kubernetes.io/projected/8c57b924-7f86-49fd-badd-28b68c92f31c-kube-api-access-z4zv6\") pod \"kserve-controller-manager-6b667fdd66-qjtj6\" (UID: \"8c57b924-7f86-49fd-badd-28b68c92f31c\") " pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:51.339570 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.339448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c57b924-7f86-49fd-badd-28b68c92f31c-cert\") pod \"kserve-controller-manager-6b667fdd66-qjtj6\" (UID: \"8c57b924-7f86-49fd-badd-28b68c92f31c\") " 
pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:51.342480 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.342451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c57b924-7f86-49fd-badd-28b68c92f31c-cert\") pod \"kserve-controller-manager-6b667fdd66-qjtj6\" (UID: \"8c57b924-7f86-49fd-badd-28b68c92f31c\") " pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:51.350611 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.350585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4zv6\" (UniqueName: \"kubernetes.io/projected/8c57b924-7f86-49fd-badd-28b68c92f31c-kube-api-access-z4zv6\") pod \"kserve-controller-manager-6b667fdd66-qjtj6\" (UID: \"8c57b924-7f86-49fd-badd-28b68c92f31c\") " pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:51.419602 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.419573 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:51.440451 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.440368 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert\") pod \"7d15459d-dc98-43f3-87f2-b6938334c45b\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " Apr 23 13:40:51.440451 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.440444 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xtrt\" (UniqueName: \"kubernetes.io/projected/7d15459d-dc98-43f3-87f2-b6938334c45b-kube-api-access-5xtrt\") pod \"7d15459d-dc98-43f3-87f2-b6938334c45b\" (UID: \"7d15459d-dc98-43f3-87f2-b6938334c45b\") " Apr 23 13:40:51.442761 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.442730 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert" (OuterVolumeSpecName: "cert") pod "7d15459d-dc98-43f3-87f2-b6938334c45b" (UID: "7d15459d-dc98-43f3-87f2-b6938334c45b"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:40:51.442899 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.442780 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d15459d-dc98-43f3-87f2-b6938334c45b-kube-api-access-5xtrt" (OuterVolumeSpecName: "kube-api-access-5xtrt") pod "7d15459d-dc98-43f3-87f2-b6938334c45b" (UID: "7d15459d-dc98-43f3-87f2-b6938334c45b"). InnerVolumeSpecName "kube-api-access-5xtrt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:40:51.541466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.541429 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d15459d-dc98-43f3-87f2-b6938334c45b-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:40:51.541466 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.541463 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xtrt\" (UniqueName: \"kubernetes.io/projected/7d15459d-dc98-43f3-87f2-b6938334c45b-kube-api-access-5xtrt\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:40:51.558305 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.558263 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:51.688011 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.687927 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-qjtj6"] Apr 23 13:40:51.690390 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:40:51.690361 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c57b924_7f86_49fd_badd_28b68c92f31c.slice/crio-a16f5c2009eb047028f8adb93dd191127dfbe9b7a9f6a2fc690e658599ac2421 WatchSource:0}: Error finding container a16f5c2009eb047028f8adb93dd191127dfbe9b7a9f6a2fc690e658599ac2421: Status 404 returned error can't find the container with id a16f5c2009eb047028f8adb93dd191127dfbe9b7a9f6a2fc690e658599ac2421 Apr 23 13:40:51.877362 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.877324 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d15459d-dc98-43f3-87f2-b6938334c45b" containerID="963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b" exitCode=0 Apr 23 13:40:51.877574 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.877416 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" event={"ID":"7d15459d-dc98-43f3-87f2-b6938334c45b","Type":"ContainerDied","Data":"963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b"} Apr 23 13:40:51.877574 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.877447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" event={"ID":"7d15459d-dc98-43f3-87f2-b6938334c45b","Type":"ContainerDied","Data":"caecc9eb724c30a86154a30d1707d900efcbcaa45607adb9055ad66cb3559cc7"} Apr 23 13:40:51.877574 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.877474 2576 scope.go:117] "RemoveContainer" containerID="963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b" Apr 23 13:40:51.877748 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.877724 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-vhxx9" Apr 23 13:40:51.879459 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.879423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" event={"ID":"8c57b924-7f86-49fd-badd-28b68c92f31c","Type":"ContainerStarted","Data":"a16f5c2009eb047028f8adb93dd191127dfbe9b7a9f6a2fc690e658599ac2421"} Apr 23 13:40:51.887639 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.887615 2576 scope.go:117] "RemoveContainer" containerID="963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b" Apr 23 13:40:51.887950 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:40:51.887928 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b\": container with ID starting with 963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b not found: ID does not exist" 
containerID="963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b" Apr 23 13:40:51.888016 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.887964 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b"} err="failed to get container status \"963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b\": rpc error: code = NotFound desc = could not find container \"963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b\": container with ID starting with 963f236c48316b5223c8c75ee2872df61f615c595c36abae77bc6c205ac36d5b not found: ID does not exist" Apr 23 13:40:51.900788 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.900749 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vhxx9"] Apr 23 13:40:51.905826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:51.905792 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-vhxx9"] Apr 23 13:40:52.062023 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:52.061992 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d15459d-dc98-43f3-87f2-b6938334c45b" path="/var/lib/kubelet/pods/7d15459d-dc98-43f3-87f2-b6938334c45b/volumes" Apr 23 13:40:52.885063 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:52.885031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" event={"ID":"8c57b924-7f86-49fd-badd-28b68c92f31c","Type":"ContainerStarted","Data":"2ef917e9c9db04d478aa1cd459260fa28b66934052e7d2383eeabea56a0f4cf3"} Apr 23 13:40:52.885463 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:52.885156 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:40:52.906713 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:40:52.906661 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" podStartSLOduration=1.537728092 podStartE2EDuration="1.906647179s" podCreationTimestamp="2026-04-23 13:40:51 +0000 UTC" firstStartedPulling="2026-04-23 13:40:51.691705572 +0000 UTC m=+518.235449764" lastFinishedPulling="2026-04-23 13:40:52.060624656 +0000 UTC m=+518.604368851" observedRunningTime="2026-04-23 13:40:52.904962374 +0000 UTC m=+519.448706602" watchObservedRunningTime="2026-04-23 13:40:52.906647179 +0000 UTC m=+519.450391443" Apr 23 13:41:23.893805 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:23.893774 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6b667fdd66-qjtj6" Apr 23 13:41:29.691581 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.691519 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f4d876c46-rcfwv"] Apr 23 13:41:29.692154 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.692073 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d15459d-dc98-43f3-87f2-b6938334c45b" containerName="manager" Apr 23 13:41:29.692154 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.692095 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d15459d-dc98-43f3-87f2-b6938334c45b" containerName="manager" Apr 23 13:41:29.692253 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.692161 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d15459d-dc98-43f3-87f2-b6938334c45b" containerName="manager" Apr 23 13:41:29.695222 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.695196 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.705962 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.705937 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f4d876c46-rcfwv"]
Apr 23 13:41:29.782995 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.782956 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-serving-cert\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.783169 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.783014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-config\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.783169 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.783037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-service-ca\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.783169 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.783062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-trusted-ca-bundle\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.783169 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.783098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-oauth-config\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.783169 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.783124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdctb\" (UniqueName: \"kubernetes.io/projected/d29fd114-9bb4-4e4e-84fb-0744f02189f0-kube-api-access-cdctb\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.783350 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.783187 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-oauth-serving-cert\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.884071 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.884022 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-serving-cert\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.884275 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.884087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-config\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.884275 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.884110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-service-ca\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.884275 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.884139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-trusted-ca-bundle\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.884275 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.884180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-oauth-config\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.884275 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.884222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdctb\" (UniqueName: \"kubernetes.io/projected/d29fd114-9bb4-4e4e-84fb-0744f02189f0-kube-api-access-cdctb\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.884553 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.884332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-oauth-serving-cert\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.885026 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.884995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-service-ca\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.885156 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.885002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-config\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.885156 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.885093 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-oauth-serving-cert\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.885156 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.885110 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d29fd114-9bb4-4e4e-84fb-0744f02189f0-trusted-ca-bundle\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.886659 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.886630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-serving-cert\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.886802 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.886782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d29fd114-9bb4-4e4e-84fb-0744f02189f0-console-oauth-config\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:29.891826 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:29.891809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdctb\" (UniqueName: \"kubernetes.io/projected/d29fd114-9bb4-4e4e-84fb-0744f02189f0-kube-api-access-cdctb\") pod \"console-6f4d876c46-rcfwv\" (UID: \"d29fd114-9bb4-4e4e-84fb-0744f02189f0\") " pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:30.006893 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:30.006789 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:30.133818 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:30.133793 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f4d876c46-rcfwv"]
Apr 23 13:41:30.135473 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:41:30.135446 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd29fd114_9bb4_4e4e_84fb_0744f02189f0.slice/crio-7dab183b9a9c5b982f46fee2773539354617e8aab85a94ef9a8a9fb24cc6e27d WatchSource:0}: Error finding container 7dab183b9a9c5b982f46fee2773539354617e8aab85a94ef9a8a9fb24cc6e27d: Status 404 returned error can't find the container with id 7dab183b9a9c5b982f46fee2773539354617e8aab85a94ef9a8a9fb24cc6e27d
Apr 23 13:41:31.014414 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:31.014378 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f4d876c46-rcfwv" event={"ID":"d29fd114-9bb4-4e4e-84fb-0744f02189f0","Type":"ContainerStarted","Data":"5da60fb8f159523425ae53808a81cab4d06f40edeadf26cbe2dc33e6135e3bd2"}
Apr 23 13:41:31.014812 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:31.014421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f4d876c46-rcfwv" event={"ID":"d29fd114-9bb4-4e4e-84fb-0744f02189f0","Type":"ContainerStarted","Data":"7dab183b9a9c5b982f46fee2773539354617e8aab85a94ef9a8a9fb24cc6e27d"}
Apr 23 13:41:31.032065 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:31.032009 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f4d876c46-rcfwv" podStartSLOduration=2.031992012 podStartE2EDuration="2.031992012s" podCreationTimestamp="2026-04-23 13:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:41:31.03120311 +0000 UTC m=+557.574947337" watchObservedRunningTime="2026-04-23 13:41:31.031992012 +0000 UTC m=+557.575736227"
Apr 23 13:41:40.007647 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:40.007596 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:40.007647 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:40.007651 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:40.012545 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:40.012487 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:40.049147 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:40.049120 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f4d876c46-rcfwv"
Apr 23 13:41:40.100970 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:41:40.100932 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c9b88cbd8-bgvcf"]
Apr 23 13:42:00.386248 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.386216 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"]
Apr 23 13:42:00.389852 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.389833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.392723 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.392697 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\""
Apr 23 13:42:00.392874 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.392855 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 13:42:00.393981 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.393964 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 13:42:00.394064 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.394050 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tk4rw\""
Apr 23 13:42:00.394112 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.394070 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\""
Apr 23 13:42:00.400032 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.400009 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"]
Apr 23 13:42:00.460957 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.460920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6rd2\" (UniqueName: \"kubernetes.io/projected/6037a5f7-804d-40a1-8f02-adb635079917-kube-api-access-s6rd2\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.461149 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.460967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6037a5f7-804d-40a1-8f02-adb635079917-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.461149 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.461045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6037a5f7-804d-40a1-8f02-adb635079917-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.461149 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.461112 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6037a5f7-804d-40a1-8f02-adb635079917-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.562064 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.562033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6rd2\" (UniqueName: \"kubernetes.io/projected/6037a5f7-804d-40a1-8f02-adb635079917-kube-api-access-s6rd2\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.562226 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.562085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6037a5f7-804d-40a1-8f02-adb635079917-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.562226 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.562129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6037a5f7-804d-40a1-8f02-adb635079917-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.562226 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.562167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6037a5f7-804d-40a1-8f02-adb635079917-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.562548 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.562529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6037a5f7-804d-40a1-8f02-adb635079917-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.562859 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.562826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6037a5f7-804d-40a1-8f02-adb635079917-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.564839 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.564820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6037a5f7-804d-40a1-8f02-adb635079917-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.574993 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.574947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6rd2\" (UniqueName: \"kubernetes.io/projected/6037a5f7-804d-40a1-8f02-adb635079917-kube-api-access-s6rd2\") pod \"isvc-xgboost-graph-predictor-669d8d6456-kk2qw\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.702217 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.702092 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:00.829307 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:00.829281 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"]
Apr 23 13:42:00.831688 ip-10-0-137-177 kubenswrapper[2576]: W0423 13:42:00.831659 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6037a5f7_804d_40a1_8f02_adb635079917.slice/crio-9371b9aae0d9594a0aeefb6fd3a3f5a7ef5c3aa352d1a63a6a7b4968b293aff9 WatchSource:0}: Error finding container 9371b9aae0d9594a0aeefb6fd3a3f5a7ef5c3aa352d1a63a6a7b4968b293aff9: Status 404 returned error can't find the container with id 9371b9aae0d9594a0aeefb6fd3a3f5a7ef5c3aa352d1a63a6a7b4968b293aff9
Apr 23 13:42:01.115732 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:01.115693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" event={"ID":"6037a5f7-804d-40a1-8f02-adb635079917","Type":"ContainerStarted","Data":"9371b9aae0d9594a0aeefb6fd3a3f5a7ef5c3aa352d1a63a6a7b4968b293aff9"}
Apr 23 13:42:05.123711 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.123661 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6c9b88cbd8-bgvcf" podUID="e5af15ae-e314-4f48-9b7a-82602b85d57a" containerName="console" containerID="cri-o://90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd" gracePeriod=15
Apr 23 13:42:05.134424 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.134394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" event={"ID":"6037a5f7-804d-40a1-8f02-adb635079917","Type":"ContainerStarted","Data":"b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e"}
Apr 23 13:42:05.261149 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.261112 2576 patch_prober.go:28] interesting pod/console-6c9b88cbd8-bgvcf container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.26:8443/health\": dial tcp 10.134.0.26:8443: connect: connection refused" start-of-body=
Apr 23 13:42:05.261310 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.261169 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-6c9b88cbd8-bgvcf" podUID="e5af15ae-e314-4f48-9b7a-82602b85d57a" containerName="console" probeResult="failure" output="Get \"https://10.134.0.26:8443/health\": dial tcp 10.134.0.26:8443: connect: connection refused"
Apr 23 13:42:05.377487 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.377425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c9b88cbd8-bgvcf_e5af15ae-e314-4f48-9b7a-82602b85d57a/console/0.log"
Apr 23 13:42:05.377641 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.377497 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:42:05.512603 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.512570 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-trusted-ca-bundle\") pod \"e5af15ae-e314-4f48-9b7a-82602b85d57a\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") "
Apr 23 13:42:05.512798 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.512620 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-oauth-serving-cert\") pod \"e5af15ae-e314-4f48-9b7a-82602b85d57a\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") "
Apr 23 13:42:05.512798 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.512674 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-service-ca\") pod \"e5af15ae-e314-4f48-9b7a-82602b85d57a\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") "
Apr 23 13:42:05.512798 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.512717 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f449\" (UniqueName: \"kubernetes.io/projected/e5af15ae-e314-4f48-9b7a-82602b85d57a-kube-api-access-5f449\") pod \"e5af15ae-e314-4f48-9b7a-82602b85d57a\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") "
Apr 23 13:42:05.512798 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.512744 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-oauth-config\") pod \"e5af15ae-e314-4f48-9b7a-82602b85d57a\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") "
Apr 23 13:42:05.512798 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.512759 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-config\") pod \"e5af15ae-e314-4f48-9b7a-82602b85d57a\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") "
Apr 23 13:42:05.512798 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.512786 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-serving-cert\") pod \"e5af15ae-e314-4f48-9b7a-82602b85d57a\" (UID: \"e5af15ae-e314-4f48-9b7a-82602b85d57a\") "
Apr 23 13:42:05.513171 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.513135 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e5af15ae-e314-4f48-9b7a-82602b85d57a" (UID: "e5af15ae-e314-4f48-9b7a-82602b85d57a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:42:05.513296 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.513200 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e5af15ae-e314-4f48-9b7a-82602b85d57a" (UID: "e5af15ae-e314-4f48-9b7a-82602b85d57a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:42:05.513296 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.513210 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-service-ca" (OuterVolumeSpecName: "service-ca") pod "e5af15ae-e314-4f48-9b7a-82602b85d57a" (UID: "e5af15ae-e314-4f48-9b7a-82602b85d57a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:42:05.513384 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.513335 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-config" (OuterVolumeSpecName: "console-config") pod "e5af15ae-e314-4f48-9b7a-82602b85d57a" (UID: "e5af15ae-e314-4f48-9b7a-82602b85d57a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 13:42:05.515161 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.515135 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e5af15ae-e314-4f48-9b7a-82602b85d57a" (UID: "e5af15ae-e314-4f48-9b7a-82602b85d57a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:42:05.515253 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.515131 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5af15ae-e314-4f48-9b7a-82602b85d57a-kube-api-access-5f449" (OuterVolumeSpecName: "kube-api-access-5f449") pod "e5af15ae-e314-4f48-9b7a-82602b85d57a" (UID: "e5af15ae-e314-4f48-9b7a-82602b85d57a"). InnerVolumeSpecName "kube-api-access-5f449". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:42:05.515253 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.515210 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e5af15ae-e314-4f48-9b7a-82602b85d57a" (UID: "e5af15ae-e314-4f48-9b7a-82602b85d57a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:42:05.613605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.613569 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5f449\" (UniqueName: \"kubernetes.io/projected/e5af15ae-e314-4f48-9b7a-82602b85d57a-kube-api-access-5f449\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:42:05.613605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.613600 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-oauth-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:42:05.613605 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.613612 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:42:05.613870 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.613622 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5af15ae-e314-4f48-9b7a-82602b85d57a-console-serving-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:42:05.613870 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.613630 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-trusted-ca-bundle\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:42:05.613870 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.613639 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-oauth-serving-cert\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:42:05.613870 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:05.613650 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e5af15ae-e314-4f48-9b7a-82602b85d57a-service-ca\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\""
Apr 23 13:42:06.139134 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.139109 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c9b88cbd8-bgvcf_e5af15ae-e314-4f48-9b7a-82602b85d57a/console/0.log"
Apr 23 13:42:06.139587 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.139150 2576 generic.go:358] "Generic (PLEG): container finished" podID="e5af15ae-e314-4f48-9b7a-82602b85d57a" containerID="90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd" exitCode=2
Apr 23 13:42:06.139587 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.139221 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c9b88cbd8-bgvcf"
Apr 23 13:42:06.139587 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.139258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c9b88cbd8-bgvcf" event={"ID":"e5af15ae-e314-4f48-9b7a-82602b85d57a","Type":"ContainerDied","Data":"90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd"}
Apr 23 13:42:06.139587 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.139303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c9b88cbd8-bgvcf" event={"ID":"e5af15ae-e314-4f48-9b7a-82602b85d57a","Type":"ContainerDied","Data":"43a0b9b1a61c160435f3017065c677d7cf1e921930b4960c80412d928b44d32c"}
Apr 23 13:42:06.139587 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.139326 2576 scope.go:117] "RemoveContainer" containerID="90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd"
Apr 23 13:42:06.147476 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.147456 2576 scope.go:117] "RemoveContainer" containerID="90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd"
Apr 23 13:42:06.147746 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:42:06.147725 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd\": container with ID starting with 90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd not found: ID does not exist" containerID="90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd"
Apr 23 13:42:06.147796 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.147755 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd"} err="failed to get container status \"90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd\": rpc error: code = NotFound desc = could not find container \"90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd\": container with ID starting with 90696de39153ea73edc33fd9459862ce8591b297ba2992535b63cf6855c778cd not found: ID does not exist"
Apr 23 13:42:06.158282 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.158256 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6c9b88cbd8-bgvcf"]
Apr 23 13:42:06.161787 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:06.161766 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6c9b88cbd8-bgvcf"]
Apr 23 13:42:08.061460 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:08.061426 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5af15ae-e314-4f48-9b7a-82602b85d57a" path="/var/lib/kubelet/pods/e5af15ae-e314-4f48-9b7a-82602b85d57a/volumes"
Apr 23 13:42:09.152056 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:09.152026 2576 generic.go:358] "Generic (PLEG): container finished" podID="6037a5f7-804d-40a1-8f02-adb635079917" containerID="b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e" exitCode=0
Apr 23 13:42:09.152056 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:09.152061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" event={"ID":"6037a5f7-804d-40a1-8f02-adb635079917","Type":"ContainerDied","Data":"b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e"}
Apr 23 13:42:13.964940 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:13.964911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log"
Apr 23 13:42:13.965667 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:13.965642 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log"
Apr 23 13:42:28.227830 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:28.227788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" event={"ID":"6037a5f7-804d-40a1-8f02-adb635079917","Type":"ContainerStarted","Data":"56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a"}
Apr 23 13:42:30.239738 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:30.239703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" event={"ID":"6037a5f7-804d-40a1-8f02-adb635079917","Type":"ContainerStarted","Data":"3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa"}
Apr 23 13:42:30.240230 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:30.239967 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:30.261104 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:30.261044 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podStartSLOduration=1.013929095 podStartE2EDuration="30.261029536s" podCreationTimestamp="2026-04-23 13:42:00 +0000 UTC" firstStartedPulling="2026-04-23 13:42:00.833419612 +0000 UTC m=+587.377163804" lastFinishedPulling="2026-04-23 13:42:30.080520033 +0000 UTC m=+616.624264245" observedRunningTime="2026-04-23 13:42:30.258152539 +0000 UTC m=+616.801896754" watchObservedRunningTime="2026-04-23 13:42:30.261029536 +0000 UTC m=+616.804773796"
Apr 23 13:42:31.243556 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:31.243524 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:31.244874 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:31.244844 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:42:32.246492 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:32.246449 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:42:37.250821 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:37.250791 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:42:37.251408 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:37.251382 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:42:47.251855 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:47.251814 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:42:57.251612 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:42:57.251566 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:43:07.251986 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:43:07.251939 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:43:17.251476 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:43:17.251436 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:43:27.252083 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:43:27.252047 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 23 13:43:37.252140 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:43:37.252111 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"
Apr 23 13:44:10.440824 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:10.440735 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"]
Apr 23 13:44:10.441352 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:10.441121 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container"
containerID="cri-o://56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a" gracePeriod=30 Apr 23 13:44:10.441352 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:10.441270 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kube-rbac-proxy" containerID="cri-o://3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa" gracePeriod=30 Apr 23 13:44:10.576130 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:10.576099 2576 generic.go:358] "Generic (PLEG): container finished" podID="6037a5f7-804d-40a1-8f02-adb635079917" containerID="3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa" exitCode=2 Apr 23 13:44:10.576297 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:10.576174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" event={"ID":"6037a5f7-804d-40a1-8f02-adb635079917","Type":"ContainerDied","Data":"3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa"} Apr 23 13:44:12.247371 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:12.247323 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.134.0.35:8643/healthz\": dial tcp 10.134.0.35:8643: connect: connection refused" Apr 23 13:44:14.130430 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.130390 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" Apr 23 13:44:14.198568 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.198476 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6rd2\" (UniqueName: \"kubernetes.io/projected/6037a5f7-804d-40a1-8f02-adb635079917-kube-api-access-s6rd2\") pod \"6037a5f7-804d-40a1-8f02-adb635079917\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " Apr 23 13:44:14.198568 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.198563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6037a5f7-804d-40a1-8f02-adb635079917-kserve-provision-location\") pod \"6037a5f7-804d-40a1-8f02-adb635079917\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " Apr 23 13:44:14.198755 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.198604 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6037a5f7-804d-40a1-8f02-adb635079917-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"6037a5f7-804d-40a1-8f02-adb635079917\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " Apr 23 13:44:14.198755 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.198659 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6037a5f7-804d-40a1-8f02-adb635079917-proxy-tls\") pod \"6037a5f7-804d-40a1-8f02-adb635079917\" (UID: \"6037a5f7-804d-40a1-8f02-adb635079917\") " Apr 23 13:44:14.198909 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.198873 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6037a5f7-804d-40a1-8f02-adb635079917-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"6037a5f7-804d-40a1-8f02-adb635079917" (UID: "6037a5f7-804d-40a1-8f02-adb635079917"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:44:14.198971 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.198940 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6037a5f7-804d-40a1-8f02-adb635079917-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "6037a5f7-804d-40a1-8f02-adb635079917" (UID: "6037a5f7-804d-40a1-8f02-adb635079917"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:44:14.200816 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.200791 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6037a5f7-804d-40a1-8f02-adb635079917-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6037a5f7-804d-40a1-8f02-adb635079917" (UID: "6037a5f7-804d-40a1-8f02-adb635079917"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:44:14.200888 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.200851 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6037a5f7-804d-40a1-8f02-adb635079917-kube-api-access-s6rd2" (OuterVolumeSpecName: "kube-api-access-s6rd2") pod "6037a5f7-804d-40a1-8f02-adb635079917" (UID: "6037a5f7-804d-40a1-8f02-adb635079917"). InnerVolumeSpecName "kube-api-access-s6rd2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:44:14.299980 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.299935 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6rd2\" (UniqueName: \"kubernetes.io/projected/6037a5f7-804d-40a1-8f02-adb635079917-kube-api-access-s6rd2\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:44:14.299980 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.299972 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6037a5f7-804d-40a1-8f02-adb635079917-kserve-provision-location\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:44:14.299980 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.299988 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6037a5f7-804d-40a1-8f02-adb635079917-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:44:14.300221 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.300002 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6037a5f7-804d-40a1-8f02-adb635079917-proxy-tls\") on node \"ip-10-0-137-177.ec2.internal\" DevicePath \"\"" Apr 23 13:44:14.591013 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.590975 2576 generic.go:358] "Generic (PLEG): container finished" podID="6037a5f7-804d-40a1-8f02-adb635079917" containerID="56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a" exitCode=0 Apr 23 13:44:14.591186 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.591063 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" Apr 23 13:44:14.591186 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.591063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" event={"ID":"6037a5f7-804d-40a1-8f02-adb635079917","Type":"ContainerDied","Data":"56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a"} Apr 23 13:44:14.591186 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.591106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw" event={"ID":"6037a5f7-804d-40a1-8f02-adb635079917","Type":"ContainerDied","Data":"9371b9aae0d9594a0aeefb6fd3a3f5a7ef5c3aa352d1a63a6a7b4968b293aff9"} Apr 23 13:44:14.591186 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.591128 2576 scope.go:117] "RemoveContainer" containerID="3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa" Apr 23 13:44:14.599594 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.599569 2576 scope.go:117] "RemoveContainer" containerID="56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a" Apr 23 13:44:14.607752 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.607731 2576 scope.go:117] "RemoveContainer" containerID="b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e" Apr 23 13:44:14.614974 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.614948 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"] Apr 23 13:44:14.615202 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.615188 2576 scope.go:117] "RemoveContainer" containerID="3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa" Apr 23 13:44:14.615491 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:44:14.615462 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa\": container with ID starting with 3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa not found: ID does not exist" containerID="3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa" Apr 23 13:44:14.615608 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.615497 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa"} err="failed to get container status \"3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa\": rpc error: code = NotFound desc = could not find container \"3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa\": container with ID starting with 3cb7a2ffcad0d0907267fb993dd3f52a731be70b6a9fd19a3f2da5118c16aafa not found: ID does not exist" Apr 23 13:44:14.615608 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.615570 2576 scope.go:117] "RemoveContainer" containerID="56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a" Apr 23 13:44:14.615792 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:44:14.615776 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a\": container with ID starting with 56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a not found: ID does not exist" containerID="56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a" Apr 23 13:44:14.615830 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.615796 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a"} err="failed to get container status \"56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a\": rpc error: code = NotFound desc = could not find 
container \"56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a\": container with ID starting with 56b23cb64d7a6c82d614c74f8480e6d009d51f354ae2a4045ef52c5169cfe23a not found: ID does not exist" Apr 23 13:44:14.615830 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.615807 2576 scope.go:117] "RemoveContainer" containerID="b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e" Apr 23 13:44:14.616031 ip-10-0-137-177 kubenswrapper[2576]: E0423 13:44:14.616016 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e\": container with ID starting with b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e not found: ID does not exist" containerID="b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e" Apr 23 13:44:14.616067 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.616035 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e"} err="failed to get container status \"b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e\": rpc error: code = NotFound desc = could not find container \"b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e\": container with ID starting with b24aa94cd0f93b73e4933b6ac50d50a6b1ad9c470bcccef43c1cfd26692d836e not found: ID does not exist" Apr 23 13:44:14.620611 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:14.620590 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-kk2qw"] Apr 23 13:44:16.061719 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:44:16.061682 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6037a5f7-804d-40a1-8f02-adb635079917" path="/var/lib/kubelet/pods/6037a5f7-804d-40a1-8f02-adb635079917/volumes" Apr 23 
13:47:13.990943 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:47:13.990911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 13:47:13.992430 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:47:13.992406 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 13:52:14.017446 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:52:14.017414 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 13:52:14.018984 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:52:14.018964 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 13:57:14.046376 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:57:14.046341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 13:57:14.048729 ip-10-0-137-177 kubenswrapper[2576]: I0423 13:57:14.047732 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:02:14.074687 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:02:14.074580 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:02:14.078806 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:02:14.076155 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:07:14.104619 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:07:14.104499 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:07:14.108593 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:07:14.106974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:12:14.131847 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:12:14.131724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:12:14.136014 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:12:14.134881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:17:14.157523 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:17:14.157382 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:17:14.161443 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:17:14.159606 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:21:39.357658 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:39.357572 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-v6hlj_db5c8cc0-562a-4b09-9003-11f2a78bd2a6/global-pull-secret-syncer/0.log" Apr 23 14:21:39.502448 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:39.502419 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dpm52_5762a240-1436-4c52-bead-2abd75c01895/konnectivity-agent/0.log" Apr 23 14:21:39.588958 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:39.588924 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-177.ec2.internal_06e1c4c209f8543cc577f01ef69cf08e/haproxy/0.log" Apr 23 14:21:43.026239 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.026202 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cd41a95f-bc17-4277-9383-5fc99b246329/alertmanager/0.log" Apr 23 14:21:43.057389 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.057353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cd41a95f-bc17-4277-9383-5fc99b246329/config-reloader/0.log" Apr 23 14:21:43.080664 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.080622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cd41a95f-bc17-4277-9383-5fc99b246329/kube-rbac-proxy-web/0.log" Apr 23 14:21:43.105526 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.105475 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cd41a95f-bc17-4277-9383-5fc99b246329/kube-rbac-proxy/0.log" Apr 23 14:21:43.128782 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.128745 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cd41a95f-bc17-4277-9383-5fc99b246329/kube-rbac-proxy-metric/0.log" Apr 23 14:21:43.152485 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.152460 2576 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cd41a95f-bc17-4277-9383-5fc99b246329/prom-label-proxy/0.log" Apr 23 14:21:43.176818 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.176789 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_cd41a95f-bc17-4277-9383-5fc99b246329/init-config-reloader/0.log" Apr 23 14:21:43.226726 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.226690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-6fvr4_becd3753-8920-40b9-bbff-58dc7e26e9b4/cluster-monitoring-operator/0.log" Apr 23 14:21:43.372250 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.372219 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-587d985fcf-ptfdp_3b4bd3ee-f28d-44dc-a47b-9d12d6f3945e/metrics-server/0.log" Apr 23 14:21:43.535409 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.535378 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s7569_b0f2f937-55f3-482e-9e4e-bc3bfce5a791/node-exporter/0.log" Apr 23 14:21:43.555438 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.555406 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s7569_b0f2f937-55f3-482e-9e4e-bc3bfce5a791/kube-rbac-proxy/0.log" Apr 23 14:21:43.575610 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.575582 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s7569_b0f2f937-55f3-482e-9e4e-bc3bfce5a791/init-textfile/0.log" Apr 23 14:21:43.683327 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.683249 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2dcf_db1003b5-8c79-4580-8717-41e4565a67a7/kube-rbac-proxy-main/0.log" Apr 23 14:21:43.706201 ip-10-0-137-177 kubenswrapper[2576]: I0423 
14:21:43.706174 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2dcf_db1003b5-8c79-4580-8717-41e4565a67a7/kube-rbac-proxy-self/0.log" Apr 23 14:21:43.730128 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:43.730097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-k2dcf_db1003b5-8c79-4580-8717-41e4565a67a7/openshift-state-metrics/0.log" Apr 23 14:21:44.034537 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.034341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65cdfd779-glvmv_fc0d4cb9-f2b6-4e44-b6d4-49011df8f021/telemeter-client/0.log" Apr 23 14:21:44.059139 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.059108 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65cdfd779-glvmv_fc0d4cb9-f2b6-4e44-b6d4-49011df8f021/reload/0.log" Apr 23 14:21:44.082523 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.082482 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-65cdfd779-glvmv_fc0d4cb9-f2b6-4e44-b6d4-49011df8f021/kube-rbac-proxy/0.log" Apr 23 14:21:44.131669 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.131632 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccccfc88b-ddhzw_87058acd-6bca-432b-b6aa-6f61500ac7f8/thanos-query/0.log" Apr 23 14:21:44.165641 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.165615 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccccfc88b-ddhzw_87058acd-6bca-432b-b6aa-6f61500ac7f8/kube-rbac-proxy-web/0.log" Apr 23 14:21:44.197258 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.197231 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-6ccccfc88b-ddhzw_87058acd-6bca-432b-b6aa-6f61500ac7f8/kube-rbac-proxy/0.log" Apr 23 14:21:44.230531 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.230479 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccccfc88b-ddhzw_87058acd-6bca-432b-b6aa-6f61500ac7f8/prom-label-proxy/0.log" Apr 23 14:21:44.266481 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.266454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccccfc88b-ddhzw_87058acd-6bca-432b-b6aa-6f61500ac7f8/kube-rbac-proxy-rules/0.log" Apr 23 14:21:44.296917 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:44.296887 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6ccccfc88b-ddhzw_87058acd-6bca-432b-b6aa-6f61500ac7f8/kube-rbac-proxy-metrics/0.log" Apr 23 14:21:45.401803 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:45.401771 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-77v7z_3dcf186b-93ff-4283-9c0b-ec05a6c706a4/networking-console-plugin/0.log" Apr 23 14:21:45.830554 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:45.830499 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/2.log" Apr 23 14:21:45.838699 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:45.838666 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-s48s8_9aeb729e-46fa-42be-8d0f-9045eabfad26/console-operator/3.log" Apr 23 14:21:46.194703 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.194616 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f4d876c46-rcfwv_d29fd114-9bb4-4e4e-84fb-0744f02189f0/console/0.log" Apr 23 
14:21:46.236312 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.236280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7tkrn_83cc2d44-2519-43b2-87a7-a7cae9cf2256/download-server/0.log" Apr 23 14:21:46.555834 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.555799 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"] Apr 23 14:21:46.556193 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556161 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kube-rbac-proxy" Apr 23 14:21:46.556193 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556173 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kube-rbac-proxy" Apr 23 14:21:46.556193 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556192 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5af15ae-e314-4f48-9b7a-82602b85d57a" containerName="console" Apr 23 14:21:46.556288 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556198 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5af15ae-e314-4f48-9b7a-82602b85d57a" containerName="console" Apr 23 14:21:46.556288 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556204 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="storage-initializer" Apr 23 14:21:46.556288 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556211 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="storage-initializer" Apr 23 14:21:46.556288 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556217 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6037a5f7-804d-40a1-8f02-adb635079917" 
containerName="kserve-container"
Apr 23 14:21:46.556288 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556222 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container"
Apr 23 14:21:46.556432 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556297 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kserve-container"
Apr 23 14:21:46.556432 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556309 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6037a5f7-804d-40a1-8f02-adb635079917" containerName="kube-rbac-proxy"
Apr 23 14:21:46.556432 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.556316 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5af15ae-e314-4f48-9b7a-82602b85d57a" containerName="console"
Apr 23 14:21:46.559263 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.559248 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.562853 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.562830 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cvccm\"/\"kube-root-ca.crt\""
Apr 23 14:21:46.562997 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.562830 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-cvccm\"/\"openshift-service-ca.crt\""
Apr 23 14:21:46.562997 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.562898 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-cvccm\"/\"default-dockercfg-2cmhh\""
Apr 23 14:21:46.570353 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.570330 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"]
Apr 23 14:21:46.612600 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.612569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm6xv\" (UniqueName: \"kubernetes.io/projected/9a387817-8ab8-4dc6-9011-d0e947a5cb83-kube-api-access-dm6xv\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.612771 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.612608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-sys\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.612771 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.612632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-proc\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.612771 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.612711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-podres\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.612771 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.612738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-lib-modules\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714150 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-lib-modules\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714340 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dm6xv\" (UniqueName: \"kubernetes.io/projected/9a387817-8ab8-4dc6-9011-d0e947a5cb83-kube-api-access-dm6xv\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714340 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714278 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-sys\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714340 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-proc\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714473 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-podres\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714473 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-lib-modules\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714473 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-sys\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714473 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-proc\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.714473 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.714442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9a387817-8ab8-4dc6-9011-d0e947a5cb83-podres\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.724142 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.724109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm6xv\" (UniqueName: \"kubernetes.io/projected/9a387817-8ab8-4dc6-9011-d0e947a5cb83-kube-api-access-dm6xv\") pod \"perf-node-gather-daemonset-xgjvb\" (UID: \"9a387817-8ab8-4dc6-9011-d0e947a5cb83\") " pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:46.869368 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:46.869264 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:47.000229 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.000204 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"]
Apr 23 14:21:47.002868 ip-10-0-137-177 kubenswrapper[2576]: W0423 14:21:47.002839 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9a387817_8ab8_4dc6_9011_d0e947a5cb83.slice/crio-b08fc3b042651d5c3e06b00cc24c8d65379743529d5736033909bab2c7e63534 WatchSource:0}: Error finding container b08fc3b042651d5c3e06b00cc24c8d65379743529d5736033909bab2c7e63534: Status 404 returned error can't find the container with id b08fc3b042651d5c3e06b00cc24c8d65379743529d5736033909bab2c7e63534
Apr 23 14:21:47.004518 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.004483 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 14:21:47.178473 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.178441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb" event={"ID":"9a387817-8ab8-4dc6-9011-d0e947a5cb83","Type":"ContainerStarted","Data":"c05a0688661c1d2c5822595df82fa277cb8a9d95d6722f3c2a8bacdb02985857"}
Apr 23 14:21:47.178473 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.178476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb" event={"ID":"9a387817-8ab8-4dc6-9011-d0e947a5cb83","Type":"ContainerStarted","Data":"b08fc3b042651d5c3e06b00cc24c8d65379743529d5736033909bab2c7e63534"}
Apr 23 14:21:47.178675 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.178543 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:47.200227 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.200171 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb" podStartSLOduration=1.20013457 podStartE2EDuration="1.20013457s" podCreationTimestamp="2026-04-23 14:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:21:47.19846426 +0000 UTC m=+2973.742208486" watchObservedRunningTime="2026-04-23 14:21:47.20013457 +0000 UTC m=+2973.743878785"
Apr 23 14:21:47.438940 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.438860 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6zrlv_d6361136-0129-4e11-9891-a7117fbe5be5/dns/0.log"
Apr 23 14:21:47.461072 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.461046 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6zrlv_d6361136-0129-4e11-9891-a7117fbe5be5/kube-rbac-proxy/0.log"
Apr 23 14:21:47.551813 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:47.551785 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v6kpm_e662fef7-fd2a-4a55-91de-e3ed361dab06/dns-node-resolver/0.log"
Apr 23 14:21:48.017498 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:48.017463 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-849f746c6-4s62k_9c8b64c2-2e90-4120-aa4f-0cc5680ac7eb/registry/0.log"
Apr 23 14:21:48.089283 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:48.089255 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n7dgj_de702b67-cf80-4b1f-b30b-e4a459ac038e/node-ca/0.log"
Apr 23 14:21:48.874782 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:48.874755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6f688cbc9d-w6cb7_a88a928d-64c1-4284-b688-d0ec2e231c16/router/0.log"
Apr 23 14:21:49.219396 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:49.219314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5s4f6_24201bdd-9893-495a-8f70-680500f3a31d/serve-healthcheck-canary/0.log"
Apr 23 14:21:49.661769 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:49.661734 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-tpxdm_60898bb3-109a-472a-a90e-9b1a908a6d36/insights-operator/0.log"
Apr 23 14:21:49.662843 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:49.662818 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-tpxdm_60898bb3-109a-472a-a90e-9b1a908a6d36/insights-operator/1.log"
Apr 23 14:21:49.683831 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:49.683804 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bkz2n_148deae9-d0c4-4c5d-ba07-4e99ac4b8c07/kube-rbac-proxy/0.log"
Apr 23 14:21:49.703697 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:49.703667 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bkz2n_148deae9-d0c4-4c5d-ba07-4e99ac4b8c07/exporter/0.log"
Apr 23 14:21:49.727542 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:49.727499 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bkz2n_148deae9-d0c4-4c5d-ba07-4e99ac4b8c07/extractor/0.log"
Apr 23 14:21:51.842911 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:51.842879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6b667fdd66-qjtj6_8c57b924-7f86-49fd-badd-28b68c92f31c/manager/0.log"
Apr 23 14:21:53.192675 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:53.192641 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-cvccm/perf-node-gather-daemonset-xgjvb"
Apr 23 14:21:56.568999 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:56.568963 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-r59bz_0b1e6cd3-17a8-4f73-b12c-4f3725a10c29/kube-storage-version-migrator-operator/1.log"
Apr 23 14:21:56.570931 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:56.570904 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-r59bz_0b1e6cd3-17a8-4f73-b12c-4f3725a10c29/kube-storage-version-migrator-operator/0.log"
Apr 23 14:21:57.526324 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:57.526294 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qqkv_6584cca1-f6ed-4d94-8644-5eb9b59e13e6/kube-multus/0.log"
Apr 23 14:21:57.555483 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:57.555454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhztr_8b934c24-9a04-47cb-a0a9-ce2109c8b735/kube-multus-additional-cni-plugins/0.log"
Apr 23 14:21:57.574436 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:57.574412 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhztr_8b934c24-9a04-47cb-a0a9-ce2109c8b735/egress-router-binary-copy/0.log"
Apr 23 14:21:57.596251 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:57.596218 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhztr_8b934c24-9a04-47cb-a0a9-ce2109c8b735/cni-plugins/0.log"
Apr 23 14:21:57.619903 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:57.619872 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhztr_8b934c24-9a04-47cb-a0a9-ce2109c8b735/bond-cni-plugin/0.log"
Apr 23 14:21:57.640143 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:57.640074 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhztr_8b934c24-9a04-47cb-a0a9-ce2109c8b735/routeoverride-cni/0.log"
Apr 23 14:21:57.662027 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:57.661998 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhztr_8b934c24-9a04-47cb-a0a9-ce2109c8b735/whereabouts-cni-bincopy/0.log"
Apr 23 14:21:57.682352 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:57.682327 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hhztr_8b934c24-9a04-47cb-a0a9-ce2109c8b735/whereabouts-cni/0.log"
Apr 23 14:21:58.133333 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:58.133297 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wzp5m_fde80200-8a4e-4844-91f0-ed8f18a92617/network-metrics-daemon/0.log"
Apr 23 14:21:58.160237 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:58.160202 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wzp5m_fde80200-8a4e-4844-91f0-ed8f18a92617/kube-rbac-proxy/0.log"
Apr 23 14:21:59.161563 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:59.161533 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhgvj_2b98e81e-dc6f-4d15-b8ec-77a01c0ee951/ovn-controller/0.log"
Apr 23 14:21:59.202236 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:59.202201 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhgvj_2b98e81e-dc6f-4d15-b8ec-77a01c0ee951/ovn-acl-logging/0.log"
Apr 23 14:21:59.224681 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:59.224633 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhgvj_2b98e81e-dc6f-4d15-b8ec-77a01c0ee951/kube-rbac-proxy-node/0.log"
Apr 23 14:21:59.247263 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:59.247241 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhgvj_2b98e81e-dc6f-4d15-b8ec-77a01c0ee951/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 14:21:59.264725 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:59.264696 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhgvj_2b98e81e-dc6f-4d15-b8ec-77a01c0ee951/northd/0.log"
Apr 23 14:21:59.285117 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:59.285091 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhgvj_2b98e81e-dc6f-4d15-b8ec-77a01c0ee951/nbdb/0.log"
Apr 23 14:21:59.306325 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:59.306297 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhgvj_2b98e81e-dc6f-4d15-b8ec-77a01c0ee951/sbdb/0.log"
Apr 23 14:21:59.480779 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:21:59.480693 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-lhgvj_2b98e81e-dc6f-4d15-b8ec-77a01c0ee951/ovnkube-controller/0.log"
Apr 23 14:22:00.784933 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:22:00.784906 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-thrhb_03ff5b18-36df-4e34-93ea-5d57a4bf949b/check-endpoints/0.log"
Apr 23 14:22:00.856147 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:22:00.856118 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fnd8j_73b441a9-2b94-42af-ba5d-7d626ce72613/network-check-target-container/0.log"
Apr 23 14:22:01.739227 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:22:01.739192 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7jsd5_e2865e96-7bac-4087-bb1b-0cf266b4deb0/iptables-alerter/0.log"
Apr 23 14:22:02.428042 ip-10-0-137-177 kubenswrapper[2576]: I0423 14:22:02.428004 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-mcpv6_047f80fa-7458-4de1-b0e4-f52fea4fbe72/tuned/0.log"