Apr 22 21:06:31.523376 ip-10-0-143-252 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 21:06:31.523387 ip-10-0-143-252 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 21:06:31.523397 ip-10-0-143-252 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 21:06:31.523734 ip-10-0-143-252 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 21:06:41.772481 ip-10-0-143-252 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 21:06:41.772498 ip-10-0-143-252 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot bfa38d30dbe0421c9f91234645c9906c --
Apr 22 21:09:13.126699 ip-10-0-143-252 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 21:09:13.583276 ip-10-0-143-252 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:09:13.583276 ip-10-0-143-252 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 21:09:13.583276 ip-10-0-143-252 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:09:13.583276 ip-10-0-143-252 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 21:09:13.583276 ip-10-0-143-252 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
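
Note: the first boot above fails before the kubelet binary ever runs. systemd cannot read an EnvironmentFile referenced by kubelet.service, so the 'start-pre' task fails and the unit ends with result 'resources', and the restart job cannot even be scheduled because crio.service does not exist yet. A minimal way to confirm both conditions from the node, assuming interactive shell access (the unit names are taken from the log itself):

    systemctl cat kubelet.service | grep -i EnvironmentFile   # which environment files the unit expects
    systemctl list-unit-files crio.service                    # whether the CRI-O unit is installed at all
    journalctl -b -1 -u kubelet.service                       # replay the previous (failed) boot's kubelet messages

By the second boot, after the "-- Boot ... --" separator, both dependencies are evidently in place and the unit proceeds to start.
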
Apr 22 21:09:13.584786 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.584642 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 21:09:13.590932 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.590888 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:13.591064 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591050 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:13.591064 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591057 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:13.591064 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591063 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:13.591064 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591066 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591069 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591072 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591076 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591079 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591082 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591084 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591087 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591090 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591093 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591095 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591103 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591106 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591109 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591112 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591115 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591117 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591120 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591123 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591126 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:13.591175 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591128 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591131 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591134 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591137 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591140 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591143 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591145 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591148 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591150 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591153 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591155 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591158 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591161 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591164 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591167 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591170 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591172 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591175 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591178 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591180 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:13.591661 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591183 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591186 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591189 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591192 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591199 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591205 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591211 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591214 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591217 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591220 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591223 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591226 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591228 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591231 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591234 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591236 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591239 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591242 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591245 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:13.592190 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591249 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591253 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591255 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591258 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591261 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591263 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591266 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591269 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591272 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591275 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591277 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591280 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591283 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591285 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591288 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591293 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591296 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591299 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591302 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591306 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:13.592688 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591309 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591312 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591314 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591774 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591781 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591784 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591787 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591789 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591792 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591795 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591797 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591800 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591802 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591805 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591807 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591810 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591812 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591815 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591819 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591822 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:13.593165 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591825 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591828 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591831 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591834 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591836 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591839 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591841 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591845 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591847 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591850 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591852 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591855 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591858 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591860 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591862 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591865 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591868 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591871 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591874 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591876 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:13.593682 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591879 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591881 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591883 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591887 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591889 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591892 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591894 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591897 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591901 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591905 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591908 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591913 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591917 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591920 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591923 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591926 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591929 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591932 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591934 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:13.594178 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591937 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591940 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591943 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591945 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591948 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591950 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591953 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591955 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591958 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591960 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591962 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591965 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591967 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591969 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591972 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591974 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591977 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591980 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591983 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591985 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:13.594675 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591988 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591990 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591993 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591996 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.591999 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592001 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592004 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592006 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592009 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592012 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592092 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592100 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592107 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592111 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592117 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592120 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592125 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592129 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592133 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592136 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592139 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 21:09:13.595169 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592143 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592146 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592149 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592152 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592155 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592158 2568 flags.go:64] FLAG: --cloud-config=""
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592161 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592164 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592170 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592172 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592176 2568 flags.go:64] FLAG: --config-dir=""
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592179 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592182 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592187 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592190 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592193 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592198 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592201 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592204 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592207 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592210 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592214 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592218 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592221 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592224 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 21:09:13.595697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592227 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592231 2568 flags.go:64] FLAG: --enable-server="true"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592234 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592238 2568 flags.go:64] FLAG: --event-burst="100"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592241 2568 flags.go:64] FLAG: --event-qps="50"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592244 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592247 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592251 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592254 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592257 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592260 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592263 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592266 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592269 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592272 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592275 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592278 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592281 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592289 2568 flags.go:64] FLAG: --feature-gates=""
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592298 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592301 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592305 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592308 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592312 2568 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592315 2568 flags.go:64] FLAG: --help="false"
Apr 22 21:09:13.596344 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592318 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-143-252.ec2.internal"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592321 2568 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592324 2568 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592327 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592330 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592336 2568 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592338 2568 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592341 2568 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592344 2568 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592348 2568 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592351 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592354 2568 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592357 2568 flags.go:64] FLAG: --kube-reserved=""
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592360 2568 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592363 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592366 2568 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592369 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592372 2568 flags.go:64] FLAG: --lock-file=""
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592375 2568 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592378 2568 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592381 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592406 2568 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592410 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592413 2568 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 21:09:13.597013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592416 2568 flags.go:64] FLAG: --logging-format="text"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592420 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592423 2568 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592426 2568 flags.go:64] FLAG: --manifest-url=""
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592429 2568 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592434 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592437 2568 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592442 2568 flags.go:64] FLAG: --max-pods="110"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592445 2568 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592448 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592451 2568 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592454 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592456 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592461 2568 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592464 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592471 2568 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592474 2568 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592477 2568 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592482 2568 flags.go:64] FLAG: --pod-cidr=""
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592485 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592491 2568 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592493 2568 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592496 2568 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592499 2568 flags.go:64] FLAG: --port="10250"
Apr 22 21:09:13.597635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592502 2568 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592505 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06bbfaedfe3748c62"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592508 2568 flags.go:64] FLAG: --qos-reserved=""
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592512 2568 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592515 2568 flags.go:64] FLAG: --register-node="true"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592517 2568 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592520 2568 flags.go:64] FLAG: --register-with-taints=""
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592524 2568 flags.go:64] FLAG: --registry-burst="10"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592527 2568 flags.go:64] FLAG: --registry-qps="5"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592530 2568 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592534 2568 flags.go:64] FLAG: --reserved-memory=""
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592538 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592541 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592544 2568 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592547 2568 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592550 2568 flags.go:64] FLAG: --runonce="false"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592553 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592556 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592559 2568 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592562 2568 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592565 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592570 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592573 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592576 2568 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592579 2568 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592581 2568 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 21:09:13.598219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592584 2568 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592588 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592591 2568 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592594 2568 flags.go:64] FLAG: --system-cgroups=""
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592596 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592602 2568 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592605 2568 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592608 2568 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592612 2568 flags.go:64] FLAG: --tls-min-version=""
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592615 2568 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592618 2568 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592620 2568 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592623 2568 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592626 2568 flags.go:64] FLAG: --v="2"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592631 2568 flags.go:64] FLAG: --version="false"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592635 2568 flags.go:64] FLAG: --vmodule=""
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592640 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592644 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592745 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592750 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592752 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592755 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592758 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:13.598935 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592761 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592764 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592766 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592769 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592774 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592776 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592779 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592781 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592784 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592787 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592789 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592793 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592796 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592799 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592801 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592804 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592806 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592809 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592811 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592814 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:13.599516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592816 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592819 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592822 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592824 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592827 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592830 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592833 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592835 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592838 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592841 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592843 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592846 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592849 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592851 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592854 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592857 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592860 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592863 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592866 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:13.600028 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592868 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592871 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592874 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592878 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592881 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592884 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592887 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592890 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592893 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592895 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592897 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592900 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592902 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592905 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592908 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592910 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592913 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592915 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592920 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:13.600556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592922 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592925 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592927 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592930 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592932 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592935 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592938 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592940 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592943 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592945 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592950 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592952 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592955 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592958 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592962 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592964 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592967 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592970 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592972 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592975 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:13.601027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592978 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592980 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.592983 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.592992 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.600405 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.600426 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600479 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600486 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600491 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600495 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600498 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600501 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600504 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600508 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600511 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:13.601562 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600514 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600517 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600520 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600523 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600526 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600529 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600532 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600534 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600537 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600540 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600542 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600545 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600548 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600551 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600554 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600556 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600560 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600563 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600566 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600568 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:13.601945 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600571 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600574 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600578 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600581 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600584 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600587 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600589 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600592 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600594 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600597 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600599 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600602 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600605 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600608 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600610 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600613 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600615 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600618 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600620 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600623 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:13.602513 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600625 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600628 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600630 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600632 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600635 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600638 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600640 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600643 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600645 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600648 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600651 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600653 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600656 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600658 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600661 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600664 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600667 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600669 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600672 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600674 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:13.603018 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600677 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600680 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600682 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600685 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600689 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600692 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600696 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600698 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600701 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600703 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600707 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600710 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600712 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600715 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600718 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600720 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:13.603516 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600723 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.600728 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600851 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600856 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600858 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600861 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600865 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600868 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600872 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600876 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600879 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600882 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600885 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600888 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600890 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:13.603916 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600893 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600895 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600898 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600900 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600903 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600906 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600908 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600911 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600913 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600916 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600919 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600921 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600923 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600926 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600929 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600931 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600934 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600936 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600939 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600941 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:13.604299 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600944 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600946 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600948 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600951 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600954 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600957 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600960 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600963 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600966 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600968 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600971 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600974 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600976 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600979 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600981 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600984 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600987 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600989 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600992 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600994 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:13.604801 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600997 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.600999 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601002 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601004 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601007 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601009 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601012 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601015 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601017 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601019 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601022 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601024 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601026 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601029 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601033 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601036 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601039 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601042 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601045 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:13.605298 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601048 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601051 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601053 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601056 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601058 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601061 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601063 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601065 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601068 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601071 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601073 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601076 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601078 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:13.601080 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.601085 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
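[Editor's note] The same "unrecognized feature gate:" flood appears three times above, each ending in an identical `feature gates: {map[...]}` summary; the gate map from the rendered config is evidently parsed more than once during startup. Names owned by OpenShift operators (ManagedBootImages, NewOLM, GatewayAPI, ...) are simply not registered in the kubelet's own feature-gate table, so they are warned about and skipped, while the gates the kubelet does know become the effective map. A minimal sketch of that behavior using k8s.io/component-base/featuregate follows; the registered gates and defaults here are invented for the example, and depending on the component-base version an unknown name is returned as an error from SetFromMap or merely logged as a warning like the feature_gate.go:328 lines above.

```go
package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	gates := featuregate.NewFeatureGate()

	// Gates this binary knows about (hypothetical examples).
	if err := gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"ImageVolume": {Default: true, PreRelease: featuregate.Beta},
		"NodeSwap":    {Default: false, PreRelease: featuregate.Beta},
	}); err != nil {
		panic(err)
	}

	// Apply a rendered gate map that also carries operator-owned
	// names this registry never registered.
	err := gates.SetFromMap(map[string]bool{
		"ImageVolume":       true,
		"ManagedBootImages": true, // unknown to this registry
	})
	fmt.Println("set result:", err) // reports the unrecognized gate

	// Only registered gates end up in the effective map, mirroring
	// the final `feature gates: {map[...]}` summary lines above.
	fmt.Println("ImageVolume enabled:", gates.Enabled("ImageVolume"))
}
```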
Apr 22 21:09:13.605797 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.601785 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 21:09:13.606170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.603892 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 21:09:13.606170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.604995 2568 server.go:1019] "Starting client certificate rotation"
Apr 22 21:09:13.606170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.605095 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 21:09:13.606170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.605139 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 21:09:13.635190 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.635161 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 21:09:13.637782 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.637756 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 21:09:13.651439 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.651410 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 22 21:09:13.657052 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.657032 2568 log.go:25] "Validated CRI v1 image API"
Apr 22 21:09:13.659783 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.659756 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 21:09:13.663986 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.663959 2568 fs.go:135] Filesystem UUIDs: map[0357ebaf-801b-4778-8cc8-9e56a9e71d22:/dev/nvme0n1p3 1dc20649-7304-4536-95b6-8a0b09eabb23:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 22 21:09:13.663986 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.663983 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 21:09:13.668933 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.668908 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 21:09:13.669259 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.669152 2568 manager.go:217] Machine: {Timestamp:2026-04-22 21:09:13.667716943 +0000 UTC m=+0.411765652 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101352 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec25381e660c2b7b8507e02c210af6b8 SystemUUID:ec25381e-660c-2b7b-8507-e02c210af6b8 BootID:bfa38d30-dbe0-421c-9f91-234645c9906c Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:70:62:5d:26:a9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:70:62:5d:26:a9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:16:15:41:4f:01:a0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 21:09:13.669259 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.669253 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 21:09:13.669369 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.669341 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 21:09:13.669697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.669671 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 21:09:13.669842 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.669698 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-252.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
path" path="/etc/kubernetes/manifests" Apr 22 21:09:13.674167 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.674131 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 21:09:13.674167 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.674149 2568 kubelet.go:397] "Adding apiserver pod source" Apr 22 21:09:13.674167 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.674158 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 21:09:13.675288 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.675272 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 21:09:13.675328 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.675299 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 21:09:13.681728 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.681707 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 21:09:13.682132 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.682107 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5k8pt" Apr 22 21:09:13.683008 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.682995 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 21:09:13.684486 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684472 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 21:09:13.684546 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684497 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 21:09:13.684546 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684504 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 21:09:13.684546 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684510 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 21:09:13.684546 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684515 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 21:09:13.684546 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684521 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 21:09:13.684546 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684532 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 21:09:13.684546 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684538 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 21:09:13.684546 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684548 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 21:09:13.684885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684555 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 21:09:13.684885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684576 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 21:09:13.684885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.684585 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 21:09:13.685449 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:09:13.685437 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 21:09:13.685449 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.685450 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 21:09:13.686840 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.686816 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-252.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 21:09:13.687002 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.686988 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5k8pt" Apr 22 21:09:13.687123 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.687103 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 21:09:13.689703 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.689687 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 21:09:13.689800 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.689728 2568 server.go:1295] "Started kubelet" Apr 22 21:09:13.689852 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.689797 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 21:09:13.689926 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.689883 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 21:09:13.689968 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.689956 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 21:09:13.690695 ip-10-0-143-252 systemd[1]: Started Kubernetes Kubelet. 
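[Editor's note] The csr.go:274/csr.go:270 pair above is the TLS bootstrap completing: the kubelet's first client-certificate request (csr-5k8pt) is approved and then issued, and the two "system:anonymous" reflector errors in between are the expected symptom of watching the API before that certificate exists. A small client-go sketch for inspecting such CSRs after the fact; admin credentials and the kubeconfig path are assumptions for the example, not taken from this log:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Hypothetical admin kubeconfig path.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/admin.kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().
		List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, csr := range csrs.Items {
		// Approval/denial lives in status.conditions; the issued
		// certificate, once signed, appears in status.certificate.
		fmt.Printf("%s requestor=%s approvedOrDenied=%t issued=%t\n",
			csr.Name, csr.Spec.Username,
			len(csr.Status.Conditions) > 0,
			len(csr.Status.Certificate) > 0)
	}
}
```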
Apr 22 21:09:13.690884 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.690865 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 21:09:13.691696 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.691683 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 21:09:13.697771 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.697747 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 21:09:13.698154 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.698132 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 21:09:13.698814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.698793 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 21:09:13.698814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.698794 2568 factory.go:55] Registering systemd factory
Apr 22 21:09:13.698943 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.698813 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found"
Apr 22 21:09:13.698943 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.698796 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 21:09:13.699423 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.699379 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 22 21:09:13.699553 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.699503 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 21:09:13.699921 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.699820 2568 factory.go:153] Registering CRI-O factory
Apr 22 21:09:13.699921 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.699836 2568 factory.go:223] Registration of the crio container factory successfully
Apr 22 21:09:13.699921 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.699896 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 21:09:13.700067 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.699925 2568 factory.go:103] Registering Raw factory
Apr 22 21:09:13.700067 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.699949 2568 manager.go:1196] Started watching for new ooms in manager
Apr 22 21:09:13.701063 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.701045 2568 manager.go:319] Starting recovery of all containers
Apr 22 21:09:13.701688 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.701664 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:13.702596 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.702582 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 21:09:13.702716 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.702706 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 21:09:13.702818 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.702612 2568 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 21:09:13.704499 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.704476 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-252.ec2.internal" not found
Apr 22 21:09:13.704591 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.704510 2568 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-143-252.ec2.internal\" not found" node="ip-10-0-143-252.ec2.internal"
Apr 22 21:09:13.712986 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.712864 2568 manager.go:324] Recovery completed
Apr 22 21:09:13.717814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.717775 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 21:09:13.719324 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.719310 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-252.ec2.internal" not found
Apr 22 21:09:13.720200 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.720186 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 21:09:13.720253 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.720220 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 21:09:13.720253 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.720245 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientPID"
Apr 22 21:09:13.720789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.720776 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 21:09:13.720837 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.720789 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 21:09:13.720837 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.720808 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 21:09:13.722753 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.722740 2568 policy_none.go:49] "None policy: Start"
Apr 22 21:09:13.722801 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.722760 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 21:09:13.722801 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.722781 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 21:09:13.764233 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.764210 2568 manager.go:341] "Starting Device Plugin manager"
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.764287 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.764307 2568 server.go:85] "Starting device plugin registration server"
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.764637 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.764662 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.764750 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.764849 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.764858 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.765405 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 21:09:13.766170 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.765444 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-252.ec2.internal\" not found"
Apr 22 21:09:13.775234 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.775205 2568 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-143-252.ec2.internal" not found
Apr 22 21:09:13.833436 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.833400 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 21:09:13.834702 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.834672 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 21:09:13.834702 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.834703 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 21:09:13.834879 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.834726 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 21:09:13.834879 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.834737 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 21:09:13.834879 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.834779 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 21:09:13.836783 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.836761 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:13.865650 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.865614 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 21:09:13.866627 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.866610 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 21:09:13.866739 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.866645 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 21:09:13.866739 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.866660 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientPID"
Apr 22 21:09:13.866739 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.866691 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-252.ec2.internal"
Apr 22 21:09:13.874115 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.874092 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-252.ec2.internal"
Apr 22 21:09:13.874225 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.874123 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-252.ec2.internal\": node \"ip-10-0-143-252.ec2.internal\" not found"
Apr 22 21:09:13.885002 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.884977 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found"
Apr 22 21:09:13.934926 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.934886 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal"]
Apr 22 21:09:13.935047 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.935001 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 21:09:13.942898 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.942879 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 21:09:13.943014 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.942917 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 21:09:13.943014 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.942930 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientPID"
Apr 22 21:09:13.944083 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944066 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 21:09:13.944222 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944206 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:13.944269 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944237 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:09:13.944868 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944855 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:13.944920 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944877 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:13.944920 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944888 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:13.944990 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944858 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:13.944990 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944949 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:13.944990 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.944959 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:13.945869 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.945852 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" Apr 22 21:09:13.945954 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.945881 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:09:13.946617 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.946604 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:13.946694 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.946627 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:13.946694 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:13.946636 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:13.966331 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.966304 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-252.ec2.internal\" not found" node="ip-10-0-143-252.ec2.internal" Apr 22 21:09:13.970798 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.970777 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-252.ec2.internal\" not found" node="ip-10-0-143-252.ec2.internal" Apr 22 21:09:13.985510 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:13.985478 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.004198 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.004162 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4cf420de694f9ed50cad7b59988c4ab8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal\" (UID: \"4cf420de694f9ed50cad7b59988c4ab8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.004198 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.004198 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf420de694f9ed50cad7b59988c4ab8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal\" (UID: \"4cf420de694f9ed50cad7b59988c4ab8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.004441 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.004216 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/65d514ad771afddf9d09f127dfba4f00-config\") pod \"kube-apiserver-proxy-ip-10-0-143-252.ec2.internal\" (UID: \"65d514ad771afddf9d09f127dfba4f00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.086330 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.086299 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.104694 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.104673 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf420de694f9ed50cad7b59988c4ab8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal\" (UID: \"4cf420de694f9ed50cad7b59988c4ab8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.104764 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.104700 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/65d514ad771afddf9d09f127dfba4f00-config\") pod \"kube-apiserver-proxy-ip-10-0-143-252.ec2.internal\" (UID: \"65d514ad771afddf9d09f127dfba4f00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.104764 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.104717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/4cf420de694f9ed50cad7b59988c4ab8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal\" (UID: \"4cf420de694f9ed50cad7b59988c4ab8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.104864 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.104780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf420de694f9ed50cad7b59988c4ab8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal\" (UID: \"4cf420de694f9ed50cad7b59988c4ab8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.104864 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.104842 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/4cf420de694f9ed50cad7b59988c4ab8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal\" (UID: \"4cf420de694f9ed50cad7b59988c4ab8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.105127 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.104882 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/65d514ad771afddf9d09f127dfba4f00-config\") pod \"kube-apiserver-proxy-ip-10-0-143-252.ec2.internal\" (UID: \"65d514ad771afddf9d09f127dfba4f00\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.187175 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.187116 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.268699 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.268628 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.273296 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.273272 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" Apr 22 21:09:14.288004 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.287973 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.388551 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.388509 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.489136 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.489103 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.589813 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.589744 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.605108 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.605076 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 21:09:14.605278 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.605242 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 21:09:14.605337 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.605274 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 21:09:14.689097 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.689050 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 21:04:13 +0000 UTC" deadline="2027-12-18 21:54:51.223063085 +0000 UTC" Apr 22 21:09:14.689097 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.689086 2568 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14520h45m36.533979135s" Apr 22 21:09:14.690121 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.690100 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.698289 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.698262 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 21:09:14.718226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.718195 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 21:09:14.733835 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.733806 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vskmp" Apr 22 21:09:14.740837 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.740814 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vskmp" Apr 22 21:09:14.755027 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:14.754981 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf420de694f9ed50cad7b59988c4ab8.slice/crio-f4230dfaee0ddfa9933503265e2dfc074a0deffc256f4b814ccb625f994b91b1 WatchSource:0}: Error finding container f4230dfaee0ddfa9933503265e2dfc074a0deffc256f4b814ccb625f994b91b1: Status 404 returned error can't find the container with id f4230dfaee0ddfa9933503265e2dfc074a0deffc256f4b814ccb625f994b91b1 Apr 22 21:09:14.755691 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:14.755670 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65d514ad771afddf9d09f127dfba4f00.slice/crio-9ae3d84e689827bce924333ff3f4da7986aba6546f0016cfda3c274d876d03bc WatchSource:0}: Error finding container 9ae3d84e689827bce924333ff3f4da7986aba6546f0016cfda3c274d876d03bc: Status 404 returned error can't find the container with id 9ae3d84e689827bce924333ff3f4da7986aba6546f0016cfda3c274d876d03bc Apr 22 21:09:14.759689 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.759669 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:09:14.790793 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.790761 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.838262 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.838208 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" event={"ID":"4cf420de694f9ed50cad7b59988c4ab8","Type":"ContainerStarted","Data":"f4230dfaee0ddfa9933503265e2dfc074a0deffc256f4b814ccb625f994b91b1"} Apr 22 21:09:14.838973 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:14.838955 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" event={"ID":"65d514ad771afddf9d09f127dfba4f00","Type":"ContainerStarted","Data":"9ae3d84e689827bce924333ff3f4da7986aba6546f0016cfda3c274d876d03bc"} Apr 22 21:09:14.891215 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.891135 2568 
kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:14.991699 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:14.991656 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:15.090021 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.089995 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:09:15.092009 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.091983 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-252.ec2.internal\" not found" Apr 22 21:09:15.125719 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.125693 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:09:15.198821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.198693 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" Apr 22 21:09:15.209558 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.209527 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 21:09:15.210507 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.210477 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" Apr 22 21:09:15.217932 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.217906 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 21:09:15.675154 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.675073 2568 apiserver.go:52] "Watching apiserver" Apr 22 21:09:15.682355 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.682328 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 21:09:15.682813 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.682788 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-7znnp","kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq","openshift-dns/node-resolver-2ctrd","openshift-multus/multus-additional-cni-plugins-dr8fz","openshift-multus/multus-b85gf","openshift-multus/network-metrics-daemon-hptqt","openshift-network-diagnostics/network-check-target-s8cvq","openshift-cluster-node-tuning-operator/tuned-rjgp4","openshift-image-registry/node-ca-thbtv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal","openshift-network-operator/iptables-alerter-6z4hz","openshift-ovn-kubernetes/ovnkube-node-j6pw2"] Apr 22 21:09:15.686079 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.686055 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:15.686179 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.686146 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:15.688299 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.688269 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.690410 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.690368 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.690659 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.690629 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-k884x\"" Apr 22 21:09:15.690764 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.690751 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.690838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.690773 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.690838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.690754 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 21:09:15.692512 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.692491 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.692645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.692598 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qrsr9\"" Apr 22 21:09:15.692645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.692494 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.695000 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.694982 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.695340 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.695298 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.697796 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.697763 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 21:09:15.697796 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.697777 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 21:09:15.697986 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.697964 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 21:09:15.698125 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.698087 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-29lzd\"" Apr 22 21:09:15.698192 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.698130 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:15.698657 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.698639 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 21:09:15.698891 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.698875 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.699193 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.699178 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wtxv4\"" Apr 22 21:09:15.699803 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.699352 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.701222 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.701012 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 21:09:15.701222 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.701016 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 21:09:15.701222 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.701102 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8dtqk\"" Apr 22 21:09:15.703497 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.703476 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.703597 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.703553 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:15.705625 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.705606 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-blwbc\"" Apr 22 21:09:15.705625 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.705620 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.705847 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.705829 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.705995 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.705976 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.706060 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.706022 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 21:09:15.706118 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.706082 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.706583 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.706564 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.706677 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.706568 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xjz2h\"" Apr 22 21:09:15.708433 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.708415 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.708566 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.708433 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.708633 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.708432 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.708798 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.708469 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 21:09:15.708934 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.708485 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lcfnh\"" Apr 22 21:09:15.710636 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.710616 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 21:09:15.710829 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.710812 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 21:09:15.710955 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.710823 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 21:09:15.710955 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.710951 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:15.711130 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.710971 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 21:09:15.711130 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.711018 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:15.711130 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.710845 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hnktj\"" Apr 22 21:09:15.711130 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.710874 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 21:09:15.712288 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.712268 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 21:09:15.712884 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.712863 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-kubernetes\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.712977 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.712901 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-sys\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.712977 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.712930 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-registration-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.712977 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.712953 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-cnibin\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713134 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.712977 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-hostroot\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713134 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-conf-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713134 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713039 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04aad4f5-cc85-41af-a470-1fa752e56411-hosts-file\") pod \"node-resolver-2ctrd\" (UID: 
\"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.713134 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713082 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-modprobe-d\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713134 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713107 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysctl-d\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-systemd\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713162 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-os-release\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713178 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5xbk\" (UniqueName: \"kubernetes.io/projected/d5843d3e-a9c1-40f4-918c-77998582dbee-kube-api-access-q5xbk\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713196 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/04aad4f5-cc85-41af-a470-1fa752e56411-tmp-dir\") pod \"node-resolver-2ctrd\" (UID: \"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.713356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysconfig\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713316 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-k8s-cni-cncf-io\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713349 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-cni-bin\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713371 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d04c13d-b019-4af4-9237-79c3ecb7fde8-agent-certs\") pod \"konnectivity-agent-7znnp\" (UID: \"2d04c13d-b019-4af4-9237-79c3ecb7fde8\") " pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713406 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gkn4\" (UniqueName: \"kubernetes.io/projected/04aad4f5-cc85-41af-a470-1fa752e56411-kube-api-access-8gkn4\") pod \"node-resolver-2ctrd\" (UID: \"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713452 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713491 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713519 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713548 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-host\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713574 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvx2l\" (UniqueName: \"kubernetes.io/projected/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-kube-api-access-xvx2l\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713597 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:15.713629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713623 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713668 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-var-lib-kubelet\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713697 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713740 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-socket-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713767 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-sys-fs\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713791 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-system-cni-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713819 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-cni-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713842 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5843d3e-a9c1-40f4-918c-77998582dbee-cni-binary-copy\") pod \"multus-b85gf\" (UID: 
\"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713886 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-system-cni-dir\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713916 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-tuned\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713940 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-tmp\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.713971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713964 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-socket-dir-parent\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.713988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-cni-multus\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d04c13d-b019-4af4-9237-79c3ecb7fde8-konnectivity-ca\") pod \"konnectivity-agent-7znnp\" (UID: \"2d04c13d-b019-4af4-9237-79c3ecb7fde8\") " pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714034 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-cnibin\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714068 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d78k\" (UniqueName: \"kubernetes.io/projected/f02b6849-41ba-4491-9415-4a546ba5e3bb-kube-api-access-8d78k\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.714429 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:09:15.714101 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysctl-conf\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-run\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714166 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwb7v\" (UniqueName: \"kubernetes.io/projected/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-kube-api-access-hwb7v\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714215 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-device-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714222 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-multus-certs\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714302 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-etc-kubernetes\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-lib-modules\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-etc-selinux\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 
22 21:09:15.714429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714416 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhq2l\" (UniqueName: \"kubernetes.io/projected/caf7fceb-c589-4214-bb3b-008091f205e1-kube-api-access-rhq2l\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.714928 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714444 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-netns\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.714928 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714468 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-kubelet\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.714928 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714506 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-daemon-config\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.714928 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.714550 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-os-release\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.741938 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.741901 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:04:14 +0000 UTC" deadline="2027-11-09 04:37:42.504223371 +0000 UTC" Apr 22 21:09:15.741938 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.741937 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13567h28m26.762290544s" Apr 22 21:09:15.800109 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.800077 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 21:09:15.814956 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.814924 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/04aad4f5-cc85-41af-a470-1fa752e56411-tmp-dir\") pod \"node-resolver-2ctrd\" (UID: \"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.814956 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.814962 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-k8s-cni-cncf-io\") pod \"multus-b85gf\" (UID: 
\"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.815187 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.814980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-cni-bin\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.815187 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815040 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-k8s-cni-cncf-io\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.815187 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815044 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d04c13d-b019-4af4-9237-79c3ecb7fde8-agent-certs\") pod \"konnectivity-agent-7znnp\" (UID: \"2d04c13d-b019-4af4-9237-79c3ecb7fde8\") " pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:15.815187 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2309c72-e3a9-40b3-a52e-4926e4f1b291-host-slash\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.815187 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815110 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovnkube-config\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.815187 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gkn4\" (UniqueName: \"kubernetes.io/projected/04aad4f5-cc85-41af-a470-1fa752e56411-kube-api-access-8gkn4\") pod \"node-resolver-2ctrd\" (UID: \"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.815187 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815158 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.815187 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815260 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-cni-bin\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815311 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-systemd-units\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815349 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-var-lib-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815348 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/04aad4f5-cc85-41af-a470-1fa752e56411-tmp-dir\") pod \"node-resolver-2ctrd\" (UID: \"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-var-lib-kubelet\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815469 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-socket-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815493 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d04c13d-b019-4af4-9237-79c3ecb7fde8-konnectivity-ca\") pod \"konnectivity-agent-7znnp\" (UID: \"2d04c13d-b019-4af4-9237-79c3ecb7fde8\") " pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7b7b\" (UniqueName: \"kubernetes.io/projected/91e31d0a-9404-4d86-a9a6-1f28187dbd99-kube-api-access-t7b7b\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" 
Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815526 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815536 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-var-lib-kubelet\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815552 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovnkube-script-lib\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815526 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.815615 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-tuned\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815643 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-socket-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815699 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-tmp\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815740 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-cni-multus\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815768 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " 
pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-cni-multus\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815831 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-run-netns\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-node-log\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-cnibin\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815905 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d78k\" (UniqueName: \"kubernetes.io/projected/f02b6849-41ba-4491-9415-4a546ba5e3bb-kube-api-access-8d78k\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815931 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysctl-conf\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwb7v\" (UniqueName: \"kubernetes.io/projected/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-kube-api-access-hwb7v\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-cnibin\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.815985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-multus-certs\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wkg\" (UniqueName: \"kubernetes.io/projected/7024fa13-11c9-4df5-bb63-12212dd14ff1-kube-api-access-26wkg\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816033 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhq2l\" (UniqueName: \"kubernetes.io/projected/caf7fceb-c589-4214-bb3b-008091f205e1-kube-api-access-rhq2l\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.816220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816055 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-kubelet\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-multus-certs\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysctl-conf\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816079 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-run-ovn-kubernetes\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d04c13d-b019-4af4-9237-79c3ecb7fde8-konnectivity-ca\") pod \"konnectivity-agent-7znnp\" (UID: \"2d04c13d-b019-4af4-9237-79c3ecb7fde8\") " pod="kube-system/konnectivity-agent-7znnp" 
Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-conf-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816172 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-systemd\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816182 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-var-lib-kubelet\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816192 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-log-socket\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816214 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04aad4f5-cc85-41af-a470-1fa752e56411-hosts-file\") pod \"node-resolver-2ctrd\" (UID: \"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816225 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-conf-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816238 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-kubelet\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816261 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysconfig\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816285 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-cni-bin\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816303 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/04aad4f5-cc85-41af-a470-1fa752e56411-hosts-file\") pod \"node-resolver-2ctrd\" (UID: \"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816330 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-sys-fs\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.817033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816342 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysconfig\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816352 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5843d3e-a9c1-40f4-918c-77998582dbee-cni-binary-copy\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816383 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-etc-kubernetes\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-sys-fs\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816433 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-etc-kubernetes\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-host\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvx2l\" (UniqueName: \"kubernetes.io/projected/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-kube-api-access-xvx2l\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816497 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816522 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-system-cni-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-host\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-cni-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816618 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91e31d0a-9404-4d86-a9a6-1f28187dbd99-host\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.816661 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.816748 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.316713647 +0000 UTC m=+3.060762356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816814 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-cni-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.817838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816858 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-system-cni-dir\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816911 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816912 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f02b6849-41ba-4491-9415-4a546ba5e3bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.816987 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817023 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-ovn\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817055 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-system-cni-dir\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: 
\"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817081 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-socket-dir-parent\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817109 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ltq\" (UniqueName: \"kubernetes.io/projected/c2309c72-e3a9-40b3-a52e-4926e4f1b291-kube-api-access-q7ltq\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817157 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-cni-netd\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817182 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-env-overrides\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817183 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-system-cni-dir\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817264 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-socket-dir-parent\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817316 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-run\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817348 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-device-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817364 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-run\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817376 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e31d0a-9404-4d86-a9a6-1f28187dbd99-serviceca\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:15.818655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817438 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c2309c72-e3a9-40b3-a52e-4926e4f1b291-iptables-alerter-script\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817447 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-device-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817467 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-slash\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817506 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovn-node-metrics-cert\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817518 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d5843d3e-a9c1-40f4-918c-77998582dbee-cni-binary-copy\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817539 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-lib-modules\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-etc-selinux\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-netns\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817625 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-lib-modules\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-daemon-config\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-host-run-netns\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817678 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-etc-selinux\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817704 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-os-release\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817788 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-kubernetes\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-sys\") pod \"tuned-rjgp4\" (UID: 
\"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817842 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f02b6849-41ba-4491-9415-4a546ba5e3bb-os-release\") pod \"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817855 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-registration-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.819276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817900 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-sys\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817908 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/caf7fceb-c589-4214-bb3b-008091f205e1-registration-dir\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817938 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-kubernetes\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817947 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-cnibin\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.817979 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-hostroot\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818004 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-modprobe-d\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-cnibin\") pod 
\"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818026 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysctl-d\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818060 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-systemd\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818088 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-os-release\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818103 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-sysctl-d\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5xbk\" (UniqueName: \"kubernetes.io/projected/d5843d3e-a9c1-40f4-918c-77998582dbee-kube-api-access-q5xbk\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818155 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-hostroot\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-etc-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818189 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d5843d3e-a9c1-40f4-918c-77998582dbee-multus-daemon-config\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818269 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-modprobe-d\") pod \"tuned-rjgp4\" (UID: 
\"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818283 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-systemd\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.818288 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d5843d3e-a9c1-40f4-918c-77998582dbee-os-release\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.819910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.819190 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-etc-tuned\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.820547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.819261 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-tmp\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.820547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.819808 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d04c13d-b019-4af4-9237-79c3ecb7fde8-agent-certs\") pod \"konnectivity-agent-7znnp\" (UID: \"2d04c13d-b019-4af4-9237-79c3ecb7fde8\") " pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:15.822995 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.822928 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gkn4\" (UniqueName: \"kubernetes.io/projected/04aad4f5-cc85-41af-a470-1fa752e56411-kube-api-access-8gkn4\") pod \"node-resolver-2ctrd\" (UID: \"04aad4f5-cc85-41af-a470-1fa752e56411\") " pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:15.823717 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.823654 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwb7v\" (UniqueName: \"kubernetes.io/projected/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-kube-api-access-hwb7v\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:15.824134 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.824085 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhq2l\" (UniqueName: \"kubernetes.io/projected/caf7fceb-c589-4214-bb3b-008091f205e1-kube-api-access-rhq2l\") pod \"aws-ebs-csi-driver-node-62vnq\" (UID: \"caf7fceb-c589-4214-bb3b-008091f205e1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:15.824235 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.824221 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d78k\" (UniqueName: \"kubernetes.io/projected/f02b6849-41ba-4491-9415-4a546ba5e3bb-kube-api-access-8d78k\") pod 
\"multus-additional-cni-plugins-dr8fz\" (UID: \"f02b6849-41ba-4491-9415-4a546ba5e3bb\") " pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:15.825456 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.825438 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvx2l\" (UniqueName: \"kubernetes.io/projected/9a3dee73-dc36-4a5a-8f7f-0ba205d2534b-kube-api-access-xvx2l\") pod \"tuned-rjgp4\" (UID: \"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b\") " pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:15.825601 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.825583 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5xbk\" (UniqueName: \"kubernetes.io/projected/d5843d3e-a9c1-40f4-918c-77998582dbee-kube-api-access-q5xbk\") pod \"multus-b85gf\" (UID: \"d5843d3e-a9c1-40f4-918c-77998582dbee\") " pod="openshift-multus/multus-b85gf" Apr 22 21:09:15.918560 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918526 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91e31d0a-9404-4d86-a9a6-1f28187dbd99-host\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:15.918560 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:15.918781 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918638 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/91e31d0a-9404-4d86-a9a6-1f28187dbd99-host\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:15.918781 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-ovn\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.918781 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7ltq\" (UniqueName: \"kubernetes.io/projected/c2309c72-e3a9-40b3-a52e-4926e4f1b291-kube-api-access-q7ltq\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.918781 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.918967 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918787 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-cni-netd\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.918967 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918821 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.918967 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918823 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-env-overrides\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.918967 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918786 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-ovn\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.918967 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e31d0a-9404-4d86-a9a6-1f28187dbd99-serviceca\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:15.918967 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-cni-netd\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.918969 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c2309c72-e3a9-40b3-a52e-4926e4f1b291-iptables-alerter-script\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.919239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919044 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-slash\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919082 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovn-node-metrics-cert\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919122 2568 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-etc-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2309c72-e3a9-40b3-a52e-4926e4f1b291-host-slash\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.919239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovnkube-config\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919204 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-systemd-units\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-var-lib-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919258 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7b7b\" (UniqueName: \"kubernetes.io/projected/91e31d0a-9404-4d86-a9a6-1f28187dbd99-kube-api-access-t7b7b\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919283 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-env-overrides\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919296 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-etc-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919287 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovnkube-script-lib\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 
21:09:15.919340 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-run-netns\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919365 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-node-log\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26wkg\" (UniqueName: \"kubernetes.io/projected/7024fa13-11c9-4df5-bb63-12212dd14ff1-kube-api-access-26wkg\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919438 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-run-ovn-kubernetes\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919465 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-systemd\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919481 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c2309c72-e3a9-40b3-a52e-4926e4f1b291-iptables-alerter-script\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919488 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-log-socket\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-kubelet\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919556 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-cni-bin\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919567 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-run-netns\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919621 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-node-log\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.919669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919662 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-run-ovn-kubernetes\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919663 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919666 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-run-systemd\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919694 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-var-lib-openvswitch\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-kubelet\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919521 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-log-socket\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-systemd-units\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-cni-bin\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919782 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovnkube-config\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919809 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2309c72-e3a9-40b3-a52e-4926e4f1b291-host-slash\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919846 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7024fa13-11c9-4df5-bb63-12212dd14ff1-host-slash\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.919980 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/91e31d0a-9404-4d86-a9a6-1f28187dbd99-serviceca\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:15.920330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.920264 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovnkube-script-lib\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.921948 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.921925 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7024fa13-11c9-4df5-bb63-12212dd14ff1-ovn-node-metrics-cert\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.924688 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.924669 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:15.924800 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.924695 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:15.924800 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.924709 2568 projected.go:194] Error preparing data for projected volume kube-api-access-khrzb for pod openshift-network-diagnostics/network-check-target-s8cvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:15.924800 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:15.924768 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb podName:68e95be5-6911-44d9-88c0-a14e0becfcb5 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:16.424750292 +0000 UTC m=+3.168799007 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-khrzb" (UniqueName: "kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb") pod "network-check-target-s8cvq" (UID: "68e95be5-6911-44d9-88c0-a14e0becfcb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:15.926675 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.926620 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:09:15.926935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.926913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7ltq\" (UniqueName: \"kubernetes.io/projected/c2309c72-e3a9-40b3-a52e-4926e4f1b291-kube-api-access-q7ltq\") pod \"iptables-alerter-6z4hz\" (UID: \"c2309c72-e3a9-40b3-a52e-4926e4f1b291\") " pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:15.927192 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.927172 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wkg\" (UniqueName: \"kubernetes.io/projected/7024fa13-11c9-4df5-bb63-12212dd14ff1-kube-api-access-26wkg\") pod \"ovnkube-node-j6pw2\" (UID: \"7024fa13-11c9-4df5-bb63-12212dd14ff1\") " pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:15.927627 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:15.927610 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7b7b\" (UniqueName: \"kubernetes.io/projected/91e31d0a-9404-4d86-a9a6-1f28187dbd99-kube-api-access-t7b7b\") pod \"node-ca-thbtv\" (UID: \"91e31d0a-9404-4d86-a9a6-1f28187dbd99\") " pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:16.001919 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.001883 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" Apr 22 21:09:16.007468 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.007441 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2ctrd" Apr 22 21:09:16.016117 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.016087 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" Apr 22 21:09:16.022913 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.022886 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b85gf" Apr 22 21:09:16.029601 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.029571 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:16.037351 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.037326 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" Apr 22 21:09:16.043056 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.043025 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-thbtv" Apr 22 21:09:16.050714 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.050689 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-6z4hz" Apr 22 21:09:16.055714 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.055693 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:16.321981 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.321900 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:16.322151 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:16.322083 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:16.322213 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:16.322157 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:09:17.322133402 +0000 UTC m=+4.066182079 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:16.437020 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.436986 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf02b6849_41ba_4491_9415_4a546ba5e3bb.slice/crio-df6a76d3dc85c29a1abd2f07ddee5c3797b634307b5c69c005db74ea9b7eadb3 WatchSource:0}: Error finding container df6a76d3dc85c29a1abd2f07ddee5c3797b634307b5c69c005db74ea9b7eadb3: Status 404 returned error can't find the container with id df6a76d3dc85c29a1abd2f07ddee5c3797b634307b5c69c005db74ea9b7eadb3 Apr 22 21:09:16.439089 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.439063 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5843d3e_a9c1_40f4_918c_77998582dbee.slice/crio-fc5f7dbfbeb8d779f9537397a6804d2fdd24d873dd441a766d4c5bbc2a00d9b4 WatchSource:0}: Error finding container fc5f7dbfbeb8d779f9537397a6804d2fdd24d873dd441a766d4c5bbc2a00d9b4: Status 404 returned error can't find the container with id fc5f7dbfbeb8d779f9537397a6804d2fdd24d873dd441a766d4c5bbc2a00d9b4 Apr 22 21:09:16.441955 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.441933 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d04c13d_b019_4af4_9237_79c3ecb7fde8.slice/crio-d64d68957322ee9eb618cc387e8a8e282f00ee76ecb5eb686864442954d32bea WatchSource:0}: Error finding container d64d68957322ee9eb618cc387e8a8e282f00ee76ecb5eb686864442954d32bea: Status 404 returned error can't find the container with id d64d68957322ee9eb618cc387e8a8e282f00ee76ecb5eb686864442954d32bea Apr 22 21:09:16.442776 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.442750 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91e31d0a_9404_4d86_a9a6_1f28187dbd99.slice/crio-f32749e4bb574c311059e07049f53780b5b2fd4118d795d890377431e4f9345f WatchSource:0}: Error finding container f32749e4bb574c311059e07049f53780b5b2fd4118d795d890377431e4f9345f: Status 404 returned error can't find the container with id f32749e4bb574c311059e07049f53780b5b2fd4118d795d890377431e4f9345f Apr 22 21:09:16.443631 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.443609 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaf7fceb_c589_4214_bb3b_008091f205e1.slice/crio-e3f913cfb59dafaa6c2e948e723170915bd7989555a4bfa770fa7b278c3361a7 WatchSource:0}: Error finding container e3f913cfb59dafaa6c2e948e723170915bd7989555a4bfa770fa7b278c3361a7: Status 404 returned error can't find the container with id e3f913cfb59dafaa6c2e948e723170915bd7989555a4bfa770fa7b278c3361a7 Apr 22 21:09:16.444656 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.444562 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a3dee73_dc36_4a5a_8f7f_0ba205d2534b.slice/crio-6a7a902c64c190e7ed175036a4127ce875e8f58bd3119bc23d6a7223bd871433 WatchSource:0}: Error finding container 6a7a902c64c190e7ed175036a4127ce875e8f58bd3119bc23d6a7223bd871433: Status 404 returned error can't 
find the container with id 6a7a902c64c190e7ed175036a4127ce875e8f58bd3119bc23d6a7223bd871433 Apr 22 21:09:16.445681 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.445659 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7024fa13_11c9_4df5_bb63_12212dd14ff1.slice/crio-d21526e5ecdf428a808cf3a961962c3e2bde8debe4c11776b3cc3a52de6363ea WatchSource:0}: Error finding container d21526e5ecdf428a808cf3a961962c3e2bde8debe4c11776b3cc3a52de6363ea: Status 404 returned error can't find the container with id d21526e5ecdf428a808cf3a961962c3e2bde8debe4c11776b3cc3a52de6363ea Apr 22 21:09:16.446892 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.446855 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2309c72_e3a9_40b3_a52e_4926e4f1b291.slice/crio-dc51dcfb5751188288f929eeccf910d7b46360f15bb2038aa149e13cae444d49 WatchSource:0}: Error finding container dc51dcfb5751188288f929eeccf910d7b46360f15bb2038aa149e13cae444d49: Status 404 returned error can't find the container with id dc51dcfb5751188288f929eeccf910d7b46360f15bb2038aa149e13cae444d49 Apr 22 21:09:16.448492 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:16.448466 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04aad4f5_cc85_41af_a470_1fa752e56411.slice/crio-21f2c0166d643c2f102fa6bbfb9b192b87bb7dce2b68a572df76fbcc83557948 WatchSource:0}: Error finding container 21f2c0166d643c2f102fa6bbfb9b192b87bb7dce2b68a572df76fbcc83557948: Status 404 returned error can't find the container with id 21f2c0166d643c2f102fa6bbfb9b192b87bb7dce2b68a572df76fbcc83557948 Apr 22 21:09:16.523422 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.523250 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:16.523422 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:16.523413 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:16.523601 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:16.523435 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:16.523601 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:16.523446 2568 projected.go:194] Error preparing data for projected volume kube-api-access-khrzb for pod openshift-network-diagnostics/network-check-target-s8cvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:16.523601 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:16.523498 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb podName:68e95be5-6911-44d9-88c0-a14e0becfcb5 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:17.523484258 +0000 UTC m=+4.267532935 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-khrzb" (UniqueName: "kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb") pod "network-check-target-s8cvq" (UID: "68e95be5-6911-44d9-88c0-a14e0becfcb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:16.742673 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.742564 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:04:14 +0000 UTC" deadline="2028-02-05 08:57:03.870360894 +0000 UTC" Apr 22 21:09:16.742673 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.742597 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15683h47m47.12776628s" Apr 22 21:09:16.835711 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.835406 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:16.835711 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:16.835533 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:16.847280 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.847214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2ctrd" event={"ID":"04aad4f5-cc85-41af-a470-1fa752e56411","Type":"ContainerStarted","Data":"21f2c0166d643c2f102fa6bbfb9b192b87bb7dce2b68a572df76fbcc83557948"} Apr 22 21:09:16.850745 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.850674 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"d21526e5ecdf428a808cf3a961962c3e2bde8debe4c11776b3cc3a52de6363ea"} Apr 22 21:09:16.854628 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.854567 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" event={"ID":"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b","Type":"ContainerStarted","Data":"6a7a902c64c190e7ed175036a4127ce875e8f58bd3119bc23d6a7223bd871433"} Apr 22 21:09:16.862283 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.862230 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" event={"ID":"caf7fceb-c589-4214-bb3b-008091f205e1","Type":"ContainerStarted","Data":"e3f913cfb59dafaa6c2e948e723170915bd7989555a4bfa770fa7b278c3361a7"} Apr 22 21:09:16.865662 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.865630 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-thbtv" event={"ID":"91e31d0a-9404-4d86-a9a6-1f28187dbd99","Type":"ContainerStarted","Data":"f32749e4bb574c311059e07049f53780b5b2fd4118d795d890377431e4f9345f"} Apr 22 21:09:16.871921 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.871163 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" 
event={"ID":"65d514ad771afddf9d09f127dfba4f00","Type":"ContainerStarted","Data":"4e650277ab1cabb1965d8e7c3c76bdfc22d3cf21c90edeb624b7d55a2bbf8cb6"} Apr 22 21:09:16.878662 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.878627 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6z4hz" event={"ID":"c2309c72-e3a9-40b3-a52e-4926e4f1b291","Type":"ContainerStarted","Data":"dc51dcfb5751188288f929eeccf910d7b46360f15bb2038aa149e13cae444d49"} Apr 22 21:09:16.880431 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.880404 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7znnp" event={"ID":"2d04c13d-b019-4af4-9237-79c3ecb7fde8","Type":"ContainerStarted","Data":"d64d68957322ee9eb618cc387e8a8e282f00ee76ecb5eb686864442954d32bea"} Apr 22 21:09:16.888412 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.888265 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b85gf" event={"ID":"d5843d3e-a9c1-40f4-918c-77998582dbee","Type":"ContainerStarted","Data":"fc5f7dbfbeb8d779f9537397a6804d2fdd24d873dd441a766d4c5bbc2a00d9b4"} Apr 22 21:09:16.893826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.893689 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-252.ec2.internal" podStartSLOduration=1.8936728779999998 podStartE2EDuration="1.893672878s" podCreationTimestamp="2026-04-22 21:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:16.893444421 +0000 UTC m=+3.637493121" watchObservedRunningTime="2026-04-22 21:09:16.893672878 +0000 UTC m=+3.637721609" Apr 22 21:09:16.900725 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:16.900688 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" event={"ID":"f02b6849-41ba-4491-9415-4a546ba5e3bb","Type":"ContainerStarted","Data":"df6a76d3dc85c29a1abd2f07ddee5c3797b634307b5c69c005db74ea9b7eadb3"} Apr 22 21:09:17.334062 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:17.333134 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:17.334062 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:17.333359 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:17.334062 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:17.333439 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:09:19.333420712 +0000 UTC m=+6.077469389 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:17.535430 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:17.535299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:17.536241 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:17.535769 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:17.536241 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:17.535802 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:17.536241 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:17.535816 2568 projected.go:194] Error preparing data for projected volume kube-api-access-khrzb for pod openshift-network-diagnostics/network-check-target-s8cvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:17.536241 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:17.535879 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb podName:68e95be5-6911-44d9-88c0-a14e0becfcb5 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:19.535858994 +0000 UTC m=+6.279907677 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-khrzb" (UniqueName: "kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb") pod "network-check-target-s8cvq" (UID: "68e95be5-6911-44d9-88c0-a14e0becfcb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:17.838889 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:17.836023 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:17.838889 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:17.836176 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:17.919066 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:17.918980 2568 generic.go:358] "Generic (PLEG): container finished" podID="4cf420de694f9ed50cad7b59988c4ab8" containerID="3f7d8381f29dc486e3793d028f3bcd15754d2599243fa24317ca3a80a6bd7ab9" exitCode=0 Apr 22 21:09:17.919228 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:17.919093 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" event={"ID":"4cf420de694f9ed50cad7b59988c4ab8","Type":"ContainerDied","Data":"3f7d8381f29dc486e3793d028f3bcd15754d2599243fa24317ca3a80a6bd7ab9"} Apr 22 21:09:18.835882 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:18.835841 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:18.836075 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:18.835974 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:18.931216 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:18.931179 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" event={"ID":"4cf420de694f9ed50cad7b59988c4ab8","Type":"ContainerStarted","Data":"33d0ca165c5e78fb6aa13b6fa4857afcb22c99f96cb10021bf5b3973e7a77a16"} Apr 22 21:09:19.352127 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:19.352084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:19.352446 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:19.352299 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:19.352446 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:19.352365 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:09:23.35234566 +0000 UTC m=+10.096394349 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:19.554755 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:19.554722 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:19.554922 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:19.554885 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:19.554922 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:19.554902 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:19.554922 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:19.554912 2568 projected.go:194] Error preparing data for projected volume kube-api-access-khrzb for pod openshift-network-diagnostics/network-check-target-s8cvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:19.555061 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:19.554960 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb podName:68e95be5-6911-44d9-88c0-a14e0becfcb5 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:23.554944696 +0000 UTC m=+10.298993374 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-khrzb" (UniqueName: "kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb") pod "network-check-target-s8cvq" (UID: "68e95be5-6911-44d9-88c0-a14e0becfcb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:19.835972 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:19.835895 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:19.836109 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:19.836039 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:20.020277 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.020206 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-252.ec2.internal" podStartSLOduration=5.020181346 podStartE2EDuration="5.020181346s" podCreationTimestamp="2026-04-22 21:09:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:18.943924905 +0000 UTC m=+5.687973605" watchObservedRunningTime="2026-04-22 21:09:20.020181346 +0000 UTC m=+6.764230046" Apr 22 21:09:20.020736 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.020688 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gkkxr"] Apr 22 21:09:20.023743 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.023716 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.023893 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:20.023807 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:20.059987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.059833 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62f5ce17-e153-446d-9866-1da5180f3d9a-kubelet-config\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.059987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.059897 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62f5ce17-e153-446d-9866-1da5180f3d9a-dbus\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.059987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.059951 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.161570 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.161476 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62f5ce17-e153-446d-9866-1da5180f3d9a-kubelet-config\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.161570 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.161528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/62f5ce17-e153-446d-9866-1da5180f3d9a-dbus\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.161796 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.161574 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.161796 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.161645 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62f5ce17-e153-446d-9866-1da5180f3d9a-kubelet-config\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.161796 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:20.161689 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:20.161796 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:20.161759 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret podName:62f5ce17-e153-446d-9866-1da5180f3d9a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:20.661732189 +0000 UTC m=+7.405780872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret") pod "global-pull-secret-syncer-gkkxr" (UID: "62f5ce17-e153-446d-9866-1da5180f3d9a") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:20.162003 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.161795 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62f5ce17-e153-446d-9866-1da5180f3d9a-dbus\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.666331 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.666298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:20.666537 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:20.666496 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:20.666601 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:20.666565 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret podName:62f5ce17-e153-446d-9866-1da5180f3d9a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:21.666545872 +0000 UTC m=+8.410594555 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret") pod "global-pull-secret-syncer-gkkxr" (UID: "62f5ce17-e153-446d-9866-1da5180f3d9a") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:20.835900 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:20.835773 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:20.836069 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:20.835911 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:21.675070 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:21.675020 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:21.675635 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:21.675210 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:21.675635 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:21.675280 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret podName:62f5ce17-e153-446d-9866-1da5180f3d9a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:23.6752624 +0000 UTC m=+10.419311080 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret") pod "global-pull-secret-syncer-gkkxr" (UID: "62f5ce17-e153-446d-9866-1da5180f3d9a") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:21.835323 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:21.835283 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:21.835508 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:21.835282 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:21.835508 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:21.835459 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:21.835701 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:21.835672 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:22.835382 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:22.835343 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:22.835846 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:22.835517 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:23.389550 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:23.389444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:23.389752 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.389596 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:23.389752 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.389683 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:09:31.389660735 +0000 UTC m=+18.133709433 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:23.591259 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:23.591220 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:23.591505 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.591426 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:23.591505 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.591446 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:23.591505 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.591459 2568 projected.go:194] Error preparing data for projected volume kube-api-access-khrzb for pod openshift-network-diagnostics/network-check-target-s8cvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:23.591680 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.591523 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb podName:68e95be5-6911-44d9-88c0-a14e0becfcb5 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:31.591504561 +0000 UTC m=+18.335553237 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-khrzb" (UniqueName: "kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb") pod "network-check-target-s8cvq" (UID: "68e95be5-6911-44d9-88c0-a14e0becfcb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:23.692461 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:23.692286 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:23.692461 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.692455 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:23.692687 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.692526 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret podName:62f5ce17-e153-446d-9866-1da5180f3d9a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:27.692506849 +0000 UTC m=+14.436555537 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret") pod "global-pull-secret-syncer-gkkxr" (UID: "62f5ce17-e153-446d-9866-1da5180f3d9a") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:23.836527 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:23.836475 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:23.836940 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.836645 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:23.837207 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:23.837047 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:23.837207 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:23.837162 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:24.835237 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:24.835201 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:24.835450 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:24.835321 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:25.835404 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:25.835294 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:25.835826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:25.835301 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:25.835826 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:25.835448 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
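
[Note] The recurring "Error syncing pod, skipping ... NetworkPluginNotReady" entries all trace back to one condition: the container runtime reports NetworkReady=false until a CNI config file exists in /etc/kubernetes/cni/net.d/, and on this node that file is written by ovnkube-node, which is still starting below. The authoritative check lives in the runtime's ocicni layer; the sketch here only reproduces the observable condition:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// cniConfigPresent reports whether any CNI network config exists in confDir,
// mirroring the condition behind "no CNI configuration file in
// /etc/kubernetes/cni/net.d/" (simplified: the runtime also parses the files).
func cniConfigPresent(confDir string) (bool, error) {
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(confDir, pat))
		if err != nil {
			return false, err
		}
		if len(matches) > 0 {
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ready, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
	if err != nil {
		panic(err)
	}
	// false until the network provider (here ovn-kubernetes) writes its conflist
	fmt.Println("NetworkReady:", ready)
}
```
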
pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:25.835826 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:25.835522 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:26.835256 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:26.835215 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:26.835476 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:26.835343 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:27.721523 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:27.721485 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:27.721724 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:27.721611 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:27.721724 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:27.721685 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret podName:62f5ce17-e153-446d-9866-1da5180f3d9a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:35.721666249 +0000 UTC m=+22.465714929 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret") pod "global-pull-secret-syncer-gkkxr" (UID: "62f5ce17-e153-446d-9866-1da5180f3d9a") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:27.835697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:27.835658 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:27.836126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:27.835674 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:27.836126 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:27.835808 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:27.836126 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:27.835910 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:28.835010 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:28.834974 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:28.835272 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:28.835089 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:29.835409 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:29.835360 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:29.835877 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:29.835364 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:29.835877 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:29.835518 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:29.835877 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:29.835614 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:30.835239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:30.835197 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:30.835440 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:30.835339 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:31.449657 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:31.449615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:31.449827 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:31.449735 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:31.449827 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:31.449799 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:09:47.449784155 +0000 UTC m=+34.193832837 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:31.651181 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:31.651140 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:31.651360 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:31.651328 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:31.651360 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:31.651354 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:31.651479 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:31.651364 2568 projected.go:194] Error preparing data for projected volume kube-api-access-khrzb for pod openshift-network-diagnostics/network-check-target-s8cvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:31.651479 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:31.651442 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb podName:68e95be5-6911-44d9-88c0-a14e0becfcb5 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:47.651420801 +0000 UTC m=+34.395469480 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-khrzb" (UniqueName: "kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb") pod "network-check-target-s8cvq" (UID: "68e95be5-6911-44d9-88c0-a14e0becfcb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:31.835736 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:31.835655 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:31.836123 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:31.835661 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:31.836123 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:31.835804 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:31.836123 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:31.835914 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:32.835797 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:32.835743 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:32.836263 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:32.835894 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:33.836470 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.835862 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:33.836470 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:33.836247 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:33.837343 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.837185 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:33.837343 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:33.837296 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:33.961280 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.961243 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b85gf" event={"ID":"d5843d3e-a9c1-40f4-918c-77998582dbee","Type":"ContainerStarted","Data":"f79b41d4dad859367c4548feaac4058b666933fb6d30ca612d143be0f3dd25ca"} Apr 22 21:09:33.962533 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.962507 2568 generic.go:358] "Generic (PLEG): container finished" podID="f02b6849-41ba-4491-9415-4a546ba5e3bb" containerID="18b2495ac91cecca94a5826373c4e49a941f0ede0a165cc94e75d5689f081cc2" exitCode=0 Apr 22 21:09:33.962638 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.962573 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" event={"ID":"f02b6849-41ba-4491-9415-4a546ba5e3bb","Type":"ContainerDied","Data":"18b2495ac91cecca94a5826373c4e49a941f0ede0a165cc94e75d5689f081cc2"} Apr 22 21:09:33.963815 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.963759 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2ctrd" event={"ID":"04aad4f5-cc85-41af-a470-1fa752e56411","Type":"ContainerStarted","Data":"ffdfa87d5a8497fdf331f73c0b15c6af7701d637be48f823fc67c0750f11f3b5"} Apr 22 21:09:33.965986 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.965969 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:09:33.966283 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.966263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"6a1f891c652e49f0f7ec655a2a90f59c2635b3799b1fbea64e28ebe86cc639ca"} Apr 22 21:09:33.966357 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.966292 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"3a2ddd1b185416e205db88733a1f6c09d0797a4fdef906a16de41d7948b6f016"} Apr 22 21:09:33.966357 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.966303 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"6705305dab1f039902dd359db377c3954fd9927f744f205a870074e8aa7d5861"} Apr 22 21:09:33.966357 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.966314 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerDied","Data":"2975936029a9e254b9e709df308ae5e9a39e0a72d278635a8f26fcf4c6d7e746"} Apr 22 21:09:33.966357 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.966249 2568 generic.go:358] "Generic (PLEG): 
container finished" podID="7024fa13-11c9-4df5-bb63-12212dd14ff1" containerID="2975936029a9e254b9e709df308ae5e9a39e0a72d278635a8f26fcf4c6d7e746" exitCode=1 Apr 22 21:09:33.966563 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.966371 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"b70079ce60fc2b1df3638bb94fbc5d50326ea129670c1b50744b36e08afabe8e"} Apr 22 21:09:33.967677 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.967647 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" event={"ID":"9a3dee73-dc36-4a5a-8f7f-0ba205d2534b","Type":"ContainerStarted","Data":"4a1a5e1d6ac09c5f7e18299c6217b5aacad47992b22a27747b6460b5595a2691"} Apr 22 21:09:33.970350 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.970326 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" event={"ID":"caf7fceb-c589-4214-bb3b-008091f205e1","Type":"ContainerStarted","Data":"f6486d78f05793be9136508bd7aa94e50aa11059e8631fe6be08f464d3715ab4"} Apr 22 21:09:33.971742 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.971723 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-thbtv" event={"ID":"91e31d0a-9404-4d86-a9a6-1f28187dbd99","Type":"ContainerStarted","Data":"d65521bc862ec15fb81d2b2ae910121cb16fd8cccbb849c09a1fd390f3e5015a"} Apr 22 21:09:33.973013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.972993 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7znnp" event={"ID":"2d04c13d-b019-4af4-9237-79c3ecb7fde8","Type":"ContainerStarted","Data":"faa3f84d677cc9699bad0836347a4e011c1e3189055c4ccb803fc190a98267de"} Apr 22 21:09:33.979886 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.979141 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b85gf" podStartSLOduration=4.110079483 podStartE2EDuration="20.979113239s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.440557626 +0000 UTC m=+3.184606303" lastFinishedPulling="2026-04-22 21:09:33.309591369 +0000 UTC m=+20.053640059" observedRunningTime="2026-04-22 21:09:33.976541945 +0000 UTC m=+20.720590645" watchObservedRunningTime="2026-04-22 21:09:33.979113239 +0000 UTC m=+20.723161920" Apr 22 21:09:33.989759 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:33.989718 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2ctrd" podStartSLOduration=4.142421511 podStartE2EDuration="20.989705499s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.450146461 +0000 UTC m=+3.194195145" lastFinishedPulling="2026-04-22 21:09:33.297430447 +0000 UTC m=+20.041479133" observedRunningTime="2026-04-22 21:09:33.989616966 +0000 UTC m=+20.733665664" watchObservedRunningTime="2026-04-22 21:09:33.989705499 +0000 UTC m=+20.733754197" Apr 22 21:09:34.002185 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.002140 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-thbtv" podStartSLOduration=11.885765641 podStartE2EDuration="21.002125961s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.444931059 +0000 UTC m=+3.188979740" lastFinishedPulling="2026-04-22 
21:09:25.561291374 +0000 UTC m=+12.305340060" observedRunningTime="2026-04-22 21:09:34.001858004 +0000 UTC m=+20.745906732" watchObservedRunningTime="2026-04-22 21:09:34.002125961 +0000 UTC m=+20.746174661" Apr 22 21:09:34.043102 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.043044 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7znnp" podStartSLOduration=4.189244644 podStartE2EDuration="21.043027827s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.443539785 +0000 UTC m=+3.187588476" lastFinishedPulling="2026-04-22 21:09:33.297322969 +0000 UTC m=+20.041371659" observedRunningTime="2026-04-22 21:09:34.042486307 +0000 UTC m=+20.786535009" watchObservedRunningTime="2026-04-22 21:09:34.043027827 +0000 UTC m=+20.787076525" Apr 22 21:09:34.061010 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.060963 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rjgp4" podStartSLOduration=4.197462053 podStartE2EDuration="21.060944345s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.446349293 +0000 UTC m=+3.190397972" lastFinishedPulling="2026-04-22 21:09:33.309831573 +0000 UTC m=+20.053880264" observedRunningTime="2026-04-22 21:09:34.060320996 +0000 UTC m=+20.804369691" watchObservedRunningTime="2026-04-22 21:09:34.060944345 +0000 UTC m=+20.804993045" Apr 22 21:09:34.835977 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.835945 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:34.836134 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:34.836060 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
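
[Note] The pod_startup_latency_tracker lines above satisfy a simple identity: podStartSLOduration equals podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling), i.e. the SLO figure excludes time spent pulling images. Checking it against the multus-b85gf entry using the monotonic m=+ offsets:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	firstStartedPulling := 3.184606303  // m=+3.184606303
	lastFinishedPulling := 20.053640059 // m=+20.053640059
	e2e := 20.979113239                 // podStartE2EDuration, seconds

	slo := e2e - (lastFinishedPulling - firstStartedPulling)
	// prints 4.110079483s, matching the logged podStartSLOduration exactly
	fmt.Println(time.Duration(slo * float64(time.Second)))
}
```

The same arithmetic checks out for the other tracker entries in this boot (e.g. konnectivity-agent-7znnp: 21.043027827 - (20.041371659 - 3.187588476) = 4.189244644).
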
pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:34.865548 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.865516 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 21:09:34.977978 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.977893 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:09:34.978314 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.978286 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"26e0bc3a1f4c0352d0942ed633848e913cc898dbd57d9ca7b660aec3f2507f02"} Apr 22 21:09:34.979969 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.979947 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" event={"ID":"caf7fceb-c589-4214-bb3b-008091f205e1","Type":"ContainerStarted","Data":"85d622ad0c993b6e85fe7701ced047c247d134f3b2d5a46818656f39e2fc4004"} Apr 22 21:09:34.981448 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.981406 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6z4hz" event={"ID":"c2309c72-e3a9-40b3-a52e-4926e4f1b291","Type":"ContainerStarted","Data":"b964b5a83af1cde35a6262bd7b1810284dd9fee04b761d45809bb7c33585b405"} Apr 22 21:09:34.994622 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:34.994576 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-6z4hz" podStartSLOduration=5.220328214 podStartE2EDuration="21.994555895s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.44920751 +0000 UTC m=+3.193256187" lastFinishedPulling="2026-04-22 21:09:33.223435186 +0000 UTC m=+19.967483868" observedRunningTime="2026-04-22 21:09:34.993922375 +0000 UTC m=+21.737971074" watchObservedRunningTime="2026-04-22 21:09:34.994555895 +0000 UTC m=+21.738604594" Apr 22 21:09:35.777334 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:35.777203 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T21:09:34.865541942Z","UUID":"85ba2572-f882-46de-be7d-9f2a518bf897","Handler":null,"Name":"","Endpoint":""} Apr 22 21:09:35.780166 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:35.780137 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 21:09:35.780309 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:35.780181 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 21:09:35.780309 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:35.780283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " 
pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:35.780509 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:35.780485 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:35.780642 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:35.780559 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret podName:62f5ce17-e153-446d-9866-1da5180f3d9a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:51.78053811 +0000 UTC m=+38.524586801 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret") pod "global-pull-secret-syncer-gkkxr" (UID: "62f5ce17-e153-446d-9866-1da5180f3d9a") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:35.836219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:35.836180 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:35.836452 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:35.836328 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:35.836452 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:35.836406 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:35.836587 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:35.836521 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:35.985720 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:35.985686 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" event={"ID":"caf7fceb-c589-4214-bb3b-008091f205e1","Type":"ContainerStarted","Data":"fc1c96d2669591c4abb0ff6f7694b735793147782d37b6bd19b1dc11664d3324"} Apr 22 21:09:36.001848 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:36.001791 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-62vnq" podStartSLOduration=3.6539084170000002 podStartE2EDuration="23.001772596s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.446060218 +0000 UTC m=+3.190108895" lastFinishedPulling="2026-04-22 21:09:35.793924385 +0000 UTC m=+22.537973074" observedRunningTime="2026-04-22 21:09:36.001311244 +0000 UTC m=+22.745359944" watchObservedRunningTime="2026-04-22 21:09:36.001772596 +0000 UTC m=+22.745821298" Apr 22 21:09:36.835846 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:36.835651 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:36.836001 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:36.835932 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:36.990041 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:36.989999 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:09:36.990508 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:36.990430 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"3bb0f5cb1214ad2dbbeb2eb008aa9200561efd3447fecfa84571286a3fbb3930"} Apr 22 21:09:37.583613 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:37.583573 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:37.584279 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:37.584255 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:37.835758 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:37.835676 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:37.835958 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:37.835676 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:37.835958 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:37.835806 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:37.835958 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:37.835928 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:37.992925 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:37.992894 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:37.993473 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:37.993461 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7znnp" Apr 22 21:09:38.835534 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:38.835347 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:38.835682 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:38.835602 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:38.999637 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:38.999610 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:09:39.000384 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:38.999929 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"d58afc5d5d4d320c51ec63292626624bcb1202b5b6fc23016d25230b5d8b2c16"} Apr 22 21:09:39.000384 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:39.000222 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:39.000513 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:39.000492 2568 scope.go:117] "RemoveContainer" containerID="2975936029a9e254b9e709df308ae5e9a39e0a72d278635a8f26fcf4c6d7e746" Apr 22 21:09:39.002857 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:39.002827 2568 generic.go:358] "Generic (PLEG): container finished" podID="f02b6849-41ba-4491-9415-4a546ba5e3bb" containerID="56af54e6b75fc58d8683c37d14bede482c543cb27768452ef2a4166f7ab1906b" exitCode=0 Apr 22 21:09:39.003513 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:39.003487 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" event={"ID":"f02b6849-41ba-4491-9415-4a546ba5e3bb","Type":"ContainerDied","Data":"56af54e6b75fc58d8683c37d14bede482c543cb27768452ef2a4166f7ab1906b"} Apr 22 21:09:39.017079 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:39.017054 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:39.835203 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:39.835166 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:39.835381 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:39.835183 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:39.835381 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:39.835320 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:39.835504 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:39.835404 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:40.007873 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.007846 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:09:40.008263 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.008140 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" event={"ID":"7024fa13-11c9-4df5-bb63-12212dd14ff1","Type":"ContainerStarted","Data":"4a1319a8f5628dd56b6e6f635a90c395af1e3e356601fafaaf2a748721ef0aa3"} Apr 22 21:09:40.008408 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.008379 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:40.008476 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.008415 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:40.023114 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.023084 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:09:40.033252 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.033200 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" podStartSLOduration=10.108869339 podStartE2EDuration="27.033187093s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.447731661 +0000 UTC m=+3.191780354" lastFinishedPulling="2026-04-22 21:09:33.372049425 +0000 UTC m=+20.116098108" observedRunningTime="2026-04-22 21:09:40.032050186 +0000 UTC m=+26.776098885" watchObservedRunningTime="2026-04-22 21:09:40.033187093 +0000 UTC m=+26.777235791" Apr 22 21:09:40.334053 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.334015 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gkkxr"] Apr 22 21:09:40.334234 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.334163 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:40.334310 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:40.334266 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:40.337070 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.337040 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s8cvq"] Apr 22 21:09:40.337248 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.337163 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:40.337336 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:40.337287 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:40.337410 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.337330 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hptqt"] Apr 22 21:09:40.337472 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:40.337463 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:40.337592 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:40.337571 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:41.011889 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:41.011854 2568 generic.go:358] "Generic (PLEG): container finished" podID="f02b6849-41ba-4491-9415-4a546ba5e3bb" containerID="cf0b6a1613583f55d6899cfedfef741d8160c5cc3b5130c82f7124956e04ec71" exitCode=0 Apr 22 21:09:41.012326 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:41.011937 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" event={"ID":"f02b6849-41ba-4491-9415-4a546ba5e3bb","Type":"ContainerDied","Data":"cf0b6a1613583f55d6899cfedfef741d8160c5cc3b5130c82f7124956e04ec71"} Apr 22 21:09:41.835715 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:41.835538 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:41.835895 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:41.835538 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:41.835895 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:41.835787 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:41.835967 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:41.835538 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:41.835967 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:41.835908 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:41.836029 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:41.835983 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:43.017812 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:43.017777 2568 generic.go:358] "Generic (PLEG): container finished" podID="f02b6849-41ba-4491-9415-4a546ba5e3bb" containerID="c77a9be6e497812c3f2b34c28163c9641926fe73ce225bfae1e41ae4f7bba77a" exitCode=0 Apr 22 21:09:43.018368 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:43.017824 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" event={"ID":"f02b6849-41ba-4491-9415-4a546ba5e3bb","Type":"ContainerDied","Data":"c77a9be6e497812c3f2b34c28163c9641926fe73ce225bfae1e41ae4f7bba77a"} Apr 22 21:09:43.837903 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:43.837873 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:43.838088 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:43.837874 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:43.838088 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:43.838000 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:43.838088 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:43.837873 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:43.838088 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:43.838050 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:43.838288 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:43.838117 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:45.835009 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:45.834967 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:45.835009 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:45.835007 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:45.835653 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:45.835007 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:45.835653 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:45.835106 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-s8cvq" podUID="68e95be5-6911-44d9-88c0-a14e0becfcb5" Apr 22 21:09:45.835653 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:45.835176 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gkkxr" podUID="62f5ce17-e153-446d-9866-1da5180f3d9a" Apr 22 21:09:45.835653 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:45.835295 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:09:46.532077 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.532044 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-252.ec2.internal" event="NodeReady" Apr 22 21:09:46.532331 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.532173 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 21:09:46.570637 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.570607 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qvbng"] Apr 22 21:09:46.594748 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.594708 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m9f26"] Apr 22 21:09:46.594951 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.594920 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.597441 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.597418 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 21:09:46.597562 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.597443 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 21:09:46.597562 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.597445 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8d48v\"" Apr 22 21:09:46.612621 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.612598 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qvbng"] Apr 22 21:09:46.612621 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.612625 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m9f26"] Apr 22 21:09:46.612779 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.612721 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:46.616714 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.616689 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 21:09:46.616714 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.616706 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 21:09:46.616884 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.616783 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 21:09:46.616976 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.616963 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tq52g\"" Apr 22 21:09:46.757431 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.757385 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzmz8\" (UniqueName: \"kubernetes.io/projected/cd68afce-7631-4765-af7c-a614caf39491-kube-api-access-vzmz8\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:46.757624 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.757488 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9s8v\" (UniqueName: \"kubernetes.io/projected/fd364cec-0032-4596-8b42-09cb588be2ad-kube-api-access-n9s8v\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.757624 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.757543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd364cec-0032-4596-8b42-09cb588be2ad-config-volume\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.757624 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.757600 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.757766 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.757642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd364cec-0032-4596-8b42-09cb588be2ad-tmp-dir\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.757766 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.757720 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:46.858189 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.858143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzmz8\" (UniqueName: \"kubernetes.io/projected/cd68afce-7631-4765-af7c-a614caf39491-kube-api-access-vzmz8\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.858200 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9s8v\" (UniqueName: \"kubernetes.io/projected/fd364cec-0032-4596-8b42-09cb588be2ad-kube-api-access-n9s8v\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.858235 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd364cec-0032-4596-8b42-09cb588be2ad-config-volume\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.858274 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.858299 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd364cec-0032-4596-8b42-09cb588be2ad-tmp-dir\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.858353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:46.858493 2568 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:46.858507 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:46.858560 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert podName:cd68afce-7631-4765-af7c-a614caf39491 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:47.358541095 +0000 UTC m=+34.102589774 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert") pod "ingress-canary-m9f26" (UID: "cd68afce-7631-4765-af7c-a614caf39491") : secret "canary-serving-cert" not found Apr 22 21:09:46.858789 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:46.858578 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls podName:fd364cec-0032-4596-8b42-09cb588be2ad nodeName:}" failed. No retries permitted until 2026-04-22 21:09:47.358568626 +0000 UTC m=+34.102617303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls") pod "dns-default-qvbng" (UID: "fd364cec-0032-4596-8b42-09cb588be2ad") : secret "dns-default-metrics-tls" not found Apr 22 21:09:46.859170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.858835 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd364cec-0032-4596-8b42-09cb588be2ad-tmp-dir\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.859170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.858905 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd364cec-0032-4596-8b42-09cb588be2ad-config-volume\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.871176 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.871145 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9s8v\" (UniqueName: \"kubernetes.io/projected/fd364cec-0032-4596-8b42-09cb588be2ad-kube-api-access-n9s8v\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:46.871341 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:46.871190 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzmz8\" (UniqueName: \"kubernetes.io/projected/cd68afce-7631-4765-af7c-a614caf39491-kube-api-access-vzmz8\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:47.363538 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.363499 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:47.363801 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:09:47.363555 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:47.363801 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.363670 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:47.363801 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.363676 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:47.363801 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.363728 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert podName:cd68afce-7631-4765-af7c-a614caf39491 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:48.363712269 +0000 UTC m=+35.107760952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert") pod "ingress-canary-m9f26" (UID: "cd68afce-7631-4765-af7c-a614caf39491") : secret "canary-serving-cert" not found Apr 22 21:09:47.363801 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.363741 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls podName:fd364cec-0032-4596-8b42-09cb588be2ad nodeName:}" failed. No retries permitted until 2026-04-22 21:09:48.363735255 +0000 UTC m=+35.107783932 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls") pod "dns-default-qvbng" (UID: "fd364cec-0032-4596-8b42-09cb588be2ad") : secret "dns-default-metrics-tls" not found Apr 22 21:09:47.464407 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.464356 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:47.464580 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.464540 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:47.464649 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.464635 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:10:19.464613963 +0000 UTC m=+66.208662660 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:47.666757 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.666671 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:47.666919 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.666822 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:47.666919 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.666852 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:47.666919 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.666866 2568 projected.go:194] Error preparing data for projected volume kube-api-access-khrzb for pod openshift-network-diagnostics/network-check-target-s8cvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:47.667054 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:47.666922 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb podName:68e95be5-6911-44d9-88c0-a14e0becfcb5 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:19.666908385 +0000 UTC m=+66.410957067 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-khrzb" (UniqueName: "kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb") pod "network-check-target-s8cvq" (UID: "68e95be5-6911-44d9-88c0-a14e0becfcb5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:47.835127 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.835091 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:09:47.835303 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.835132 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:47.835303 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.835173 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:09:47.838572 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.838524 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 21:09:47.838572 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.838547 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 21:09:47.839705 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.839653 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 21:09:47.839705 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.839670 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 21:09:47.839705 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.839656 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bw4nx\"" Apr 22 21:09:47.839705 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:47.839685 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gqtcf\"" Apr 22 21:09:48.373614 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:48.373575 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:48.374130 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:48.373644 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:48.374130 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:48.373759 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:48.374130 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:48.373838 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls podName:fd364cec-0032-4596-8b42-09cb588be2ad nodeName:}" failed. No retries permitted until 2026-04-22 21:09:50.373819669 +0000 UTC m=+37.117868348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls") pod "dns-default-qvbng" (UID: "fd364cec-0032-4596-8b42-09cb588be2ad") : secret "dns-default-metrics-tls" not found Apr 22 21:09:48.374130 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:48.373760 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:48.374130 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:48.373925 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert podName:cd68afce-7631-4765-af7c-a614caf39491 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:09:50.373907672 +0000 UTC m=+37.117956354 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert") pod "ingress-canary-m9f26" (UID: "cd68afce-7631-4765-af7c-a614caf39491") : secret "canary-serving-cert" not found Apr 22 21:09:50.391038 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:50.390855 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:50.391445 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:50.391062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:50.391445 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:50.390988 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:50.391445 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:50.391151 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:50.391445 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:50.391155 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls podName:fd364cec-0032-4596-8b42-09cb588be2ad nodeName:}" failed. No retries permitted until 2026-04-22 21:09:54.391138433 +0000 UTC m=+41.135187114 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls") pod "dns-default-qvbng" (UID: "fd364cec-0032-4596-8b42-09cb588be2ad") : secret "dns-default-metrics-tls" not found Apr 22 21:09:50.391445 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:50.391196 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert podName:cd68afce-7631-4765-af7c-a614caf39491 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:54.391184895 +0000 UTC m=+41.135233573 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert") pod "ingress-canary-m9f26" (UID: "cd68afce-7631-4765-af7c-a614caf39491") : secret "canary-serving-cert" not found Apr 22 21:09:51.037901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:51.037859 2568 generic.go:358] "Generic (PLEG): container finished" podID="f02b6849-41ba-4491-9415-4a546ba5e3bb" containerID="23bf94e5490845917ce3cdd5aad05468e3d7d2a50a25007313bb155e32f53e19" exitCode=0 Apr 22 21:09:51.037901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:51.037904 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" event={"ID":"f02b6849-41ba-4491-9415-4a546ba5e3bb","Type":"ContainerDied","Data":"23bf94e5490845917ce3cdd5aad05468e3d7d2a50a25007313bb155e32f53e19"} Apr 22 21:09:51.803547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:51.803495 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:51.806551 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:51.806520 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62f5ce17-e153-446d-9866-1da5180f3d9a-original-pull-secret\") pod \"global-pull-secret-syncer-gkkxr\" (UID: \"62f5ce17-e153-446d-9866-1da5180f3d9a\") " pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:52.045487 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:52.045448 2568 generic.go:358] "Generic (PLEG): container finished" podID="f02b6849-41ba-4491-9415-4a546ba5e3bb" containerID="633408fa95378df93c68b1b6989b6cf3158194b79f5be8a8af54e3e6214d634a" exitCode=0 Apr 22 21:09:52.045650 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:52.045510 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" event={"ID":"f02b6849-41ba-4491-9415-4a546ba5e3bb","Type":"ContainerDied","Data":"633408fa95378df93c68b1b6989b6cf3158194b79f5be8a8af54e3e6214d634a"} Apr 22 21:09:52.056496 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:52.056441 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gkkxr" Apr 22 21:09:52.194279 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:52.194249 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gkkxr"] Apr 22 21:09:52.197722 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:09:52.197689 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62f5ce17_e153_446d_9866_1da5180f3d9a.slice/crio-fd6928826ad8a48e37a0f23530a6228abfa6343e991e3949f9b9545aef6b3cf8 WatchSource:0}: Error finding container fd6928826ad8a48e37a0f23530a6228abfa6343e991e3949f9b9545aef6b3cf8: Status 404 returned error can't find the container with id fd6928826ad8a48e37a0f23530a6228abfa6343e991e3949f9b9545aef6b3cf8 Apr 22 21:09:53.050722 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:53.050677 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" event={"ID":"f02b6849-41ba-4491-9415-4a546ba5e3bb","Type":"ContainerStarted","Data":"27c9f5c95634154616e6041832fd1f761a82f5afd5349acb67c8972e9dc5f47d"} Apr 22 21:09:53.051786 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:53.051755 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gkkxr" event={"ID":"62f5ce17-e153-446d-9866-1da5180f3d9a","Type":"ContainerStarted","Data":"fd6928826ad8a48e37a0f23530a6228abfa6343e991e3949f9b9545aef6b3cf8"} Apr 22 21:09:53.071697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:53.071420 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dr8fz" podStartSLOduration=6.494506954 podStartE2EDuration="40.071379508s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:16.438897851 +0000 UTC m=+3.182946532" lastFinishedPulling="2026-04-22 21:09:50.015770396 +0000 UTC m=+36.759819086" observedRunningTime="2026-04-22 21:09:53.071090424 +0000 UTC m=+39.815139124" watchObservedRunningTime="2026-04-22 21:09:53.071379508 +0000 UTC m=+39.815428209" Apr 22 21:09:54.421894 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:54.421856 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:09:54.422293 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:54.421918 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:09:54.422293 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:54.422022 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:54.422293 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:54.422097 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls podName:fd364cec-0032-4596-8b42-09cb588be2ad nodeName:}" failed. No retries permitted until 2026-04-22 21:10:02.422083036 +0000 UTC m=+49.166131713 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls") pod "dns-default-qvbng" (UID: "fd364cec-0032-4596-8b42-09cb588be2ad") : secret "dns-default-metrics-tls" not found Apr 22 21:09:54.422293 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:54.422032 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:54.422293 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:09:54.422167 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert podName:cd68afce-7631-4765-af7c-a614caf39491 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:02.422154989 +0000 UTC m=+49.166203669 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert") pod "ingress-canary-m9f26" (UID: "cd68afce-7631-4765-af7c-a614caf39491") : secret "canary-serving-cert" not found Apr 22 21:09:56.059252 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:56.059207 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gkkxr" event={"ID":"62f5ce17-e153-446d-9866-1da5180f3d9a","Type":"ContainerStarted","Data":"532f7d7310ae0c93ec17d62693b02999795a3703316c4f176ff9f8a8741661fe"} Apr 22 21:09:56.077186 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:09:56.077012 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gkkxr" podStartSLOduration=32.337776358 podStartE2EDuration="36.076994189s" podCreationTimestamp="2026-04-22 21:09:20 +0000 UTC" firstStartedPulling="2026-04-22 21:09:52.199433781 +0000 UTC m=+38.943482457" lastFinishedPulling="2026-04-22 21:09:55.938651607 +0000 UTC m=+42.682700288" observedRunningTime="2026-04-22 21:09:56.076026537 +0000 UTC m=+42.820075236" watchObservedRunningTime="2026-04-22 21:09:56.076994189 +0000 UTC m=+42.821042887" Apr 22 21:10:02.481155 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:02.481113 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:10:02.481579 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:02.481173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:10:02.481579 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:02.481280 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:10:02.481579 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:02.481316 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:10:02.481579 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:02.481345 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls podName:fd364cec-0032-4596-8b42-09cb588be2ad nodeName:}" failed. 
No retries permitted until 2026-04-22 21:10:18.481330806 +0000 UTC m=+65.225379483 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls") pod "dns-default-qvbng" (UID: "fd364cec-0032-4596-8b42-09cb588be2ad") : secret "dns-default-metrics-tls" not found Apr 22 21:10:02.481579 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:02.481378 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert podName:cd68afce-7631-4765-af7c-a614caf39491 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:18.481359199 +0000 UTC m=+65.225407879 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert") pod "ingress-canary-m9f26" (UID: "cd68afce-7631-4765-af7c-a614caf39491") : secret "canary-serving-cert" not found Apr 22 21:10:12.030015 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:12.029983 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j6pw2" Apr 22 21:10:18.493585 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:18.493531 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:10:18.493585 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:18.493601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:10:18.494031 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:18.493689 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:10:18.494031 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:18.493754 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls podName:fd364cec-0032-4596-8b42-09cb588be2ad nodeName:}" failed. No retries permitted until 2026-04-22 21:10:50.493737104 +0000 UTC m=+97.237785781 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls") pod "dns-default-qvbng" (UID: "fd364cec-0032-4596-8b42-09cb588be2ad") : secret "dns-default-metrics-tls" not found Apr 22 21:10:18.494031 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:18.493695 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:10:18.494031 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:18.493821 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert podName:cd68afce-7631-4765-af7c-a614caf39491 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:50.49380789 +0000 UTC m=+97.237856568 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert") pod "ingress-canary-m9f26" (UID: "cd68afce-7631-4765-af7c-a614caf39491") : secret "canary-serving-cert" not found Apr 22 21:10:19.501099 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:19.501054 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:10:19.504221 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:19.504198 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 21:10:19.511432 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:19.511412 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 21:10:19.511499 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:19.511474 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:11:23.511457868 +0000 UTC m=+130.255506544 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : secret "metrics-daemon-secret" not found Apr 22 21:10:19.702756 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:19.702712 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:10:19.705788 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:19.705770 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 21:10:19.714947 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:19.714926 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 21:10:19.726171 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:19.726148 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khrzb\" (UniqueName: \"kubernetes.io/projected/68e95be5-6911-44d9-88c0-a14e0becfcb5-kube-api-access-khrzb\") pod \"network-check-target-s8cvq\" (UID: \"68e95be5-6911-44d9-88c0-a14e0becfcb5\") " pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:10:19.963761 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:19.963728 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gqtcf\"" Apr 22 21:10:19.971635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:19.971604 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:10:20.112095 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:20.112063 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-s8cvq"] Apr 22 21:10:20.115327 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:10:20.115293 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68e95be5_6911_44d9_88c0_a14e0becfcb5.slice/crio-6a3ab7b7bc6112e9555ccddcbbed5709061d7b37e45667ca8c80e0d4392608c3 WatchSource:0}: Error finding container 6a3ab7b7bc6112e9555ccddcbbed5709061d7b37e45667ca8c80e0d4392608c3: Status 404 returned error can't find the container with id 6a3ab7b7bc6112e9555ccddcbbed5709061d7b37e45667ca8c80e0d4392608c3 Apr 22 21:10:21.109808 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:21.109772 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s8cvq" event={"ID":"68e95be5-6911-44d9-88c0-a14e0becfcb5","Type":"ContainerStarted","Data":"6a3ab7b7bc6112e9555ccddcbbed5709061d7b37e45667ca8c80e0d4392608c3"} Apr 22 21:10:24.116451 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:24.116410 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-s8cvq" event={"ID":"68e95be5-6911-44d9-88c0-a14e0becfcb5","Type":"ContainerStarted","Data":"3851186e498ae56e43b1427b9b05ff48bdce160dd1209fd405251be6b720b1c6"} Apr 22 21:10:24.116822 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:24.116530 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:10:24.131376 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:24.131327 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-s8cvq" podStartSLOduration=66.307609373 podStartE2EDuration="1m10.131312631s" podCreationTimestamp="2026-04-22 21:09:14 +0000 UTC" firstStartedPulling="2026-04-22 21:10:20.116921781 +0000 UTC m=+66.860970457" lastFinishedPulling="2026-04-22 21:10:23.940625024 +0000 UTC m=+70.684673715" observedRunningTime="2026-04-22 21:10:24.130720039 +0000 UTC m=+70.874768755" watchObservedRunningTime="2026-04-22 21:10:24.131312631 +0000 UTC m=+70.875361330" Apr 22 21:10:50.510302 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:50.510252 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:10:50.510819 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:50.510326 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:10:50.510819 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:50.510449 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:10:50.510819 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:50.510505 2568 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:10:50.510819 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:50.510522 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls podName:fd364cec-0032-4596-8b42-09cb588be2ad nodeName:}" failed. No retries permitted until 2026-04-22 21:11:54.510506715 +0000 UTC m=+161.254555396 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls") pod "dns-default-qvbng" (UID: "fd364cec-0032-4596-8b42-09cb588be2ad") : secret "dns-default-metrics-tls" not found Apr 22 21:10:50.510819 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:50.510580 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert podName:cd68afce-7631-4765-af7c-a614caf39491 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:54.510560864 +0000 UTC m=+161.254609544 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert") pod "ingress-canary-m9f26" (UID: "cd68afce-7631-4765-af7c-a614caf39491") : secret "canary-serving-cert" not found Apr 22 21:10:55.120896 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:55.120854 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-s8cvq" Apr 22 21:10:59.374664 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.374630 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb"] Apr 22 21:10:59.376995 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.376976 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:10:59.379278 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.379251 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 21:10:59.379470 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.379453 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 21:10:59.379591 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.379482 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 21:10:59.381569 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.381546 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-ttp88\"" Apr 22 21:10:59.381768 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.381755 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 21:10:59.387209 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.387187 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb"] Apr 22 21:10:59.467916 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.467854 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rpw\" (UniqueName: \"kubernetes.io/projected/533714a1-f27e-40c7-8284-efe7ee67acf7-kube-api-access-m6rpw\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:10:59.468081 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.467957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:10:59.468081 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.468012 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/533714a1-f27e-40c7-8284-efe7ee67acf7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:10:59.568674 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.568630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/533714a1-f27e-40c7-8284-efe7ee67acf7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:10:59.568880 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.568689 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rpw\" (UniqueName: 
\"kubernetes.io/projected/533714a1-f27e-40c7-8284-efe7ee67acf7-kube-api-access-m6rpw\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:10:59.568880 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.568736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:10:59.569002 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:59.568925 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:10:59.569051 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:10:59.569009 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls podName:533714a1-f27e-40c7-8284-efe7ee67acf7 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:00.068986781 +0000 UTC m=+106.813035474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-njxpb" (UID: "533714a1-f27e-40c7-8284-efe7ee67acf7") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:10:59.569460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.569436 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/533714a1-f27e-40c7-8284-efe7ee67acf7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:10:59.582032 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:10:59.582004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rpw\" (UniqueName: \"kubernetes.io/projected/533714a1-f27e-40c7-8284-efe7ee67acf7-kube-api-access-m6rpw\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:00.072789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:00.072742 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:00.072974 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:00.072899 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:00.073016 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:00.072982 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls podName:533714a1-f27e-40c7-8284-efe7ee67acf7 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:01.072965486 +0000 UTC m=+107.817014164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-njxpb" (UID: "533714a1-f27e-40c7-8284-efe7ee67acf7") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:01.079262 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:01.079205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:01.079703 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:01.079363 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:01.079703 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:01.079467 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls podName:533714a1-f27e-40c7-8284-efe7ee67acf7 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:03.079449895 +0000 UTC m=+109.823498572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-njxpb" (UID: "533714a1-f27e-40c7-8284-efe7ee67acf7") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:03.092607 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:03.092568 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:03.093005 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:03.092725 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:03.093005 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:03.092791 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls podName:533714a1-f27e-40c7-8284-efe7ee67acf7 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:07.092775764 +0000 UTC m=+113.836824445 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-njxpb" (UID: "533714a1-f27e-40c7-8284-efe7ee67acf7") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:04.733223 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:04.733193 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2ctrd_04aad4f5-cc85-41af-a470-1fa752e56411/dns-node-resolver/0.log" Apr 22 21:11:05.733430 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:05.733383 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-thbtv_91e31d0a-9404-4d86-a9a6-1f28187dbd99/node-ca/0.log" Apr 22 21:11:07.121234 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:07.121173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:07.121653 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:07.121353 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:07.121653 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:07.121454 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls podName:533714a1-f27e-40c7-8284-efe7ee67acf7 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:15.121438111 +0000 UTC m=+121.865486792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-njxpb" (UID: "533714a1-f27e-40c7-8284-efe7ee67acf7") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:09.615440 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.615379 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd"] Apr 22 21:11:09.617250 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.617232 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-sc42w"] Apr 22 21:11:09.617432 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.617386 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:09.619714 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.619683 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 21:11:09.619837 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.619718 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 21:11:09.619837 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.619782 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:11:09.619837 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.619725 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-7dqkf\"" Apr 22 21:11:09.619837 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.619720 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.621834 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.621814 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:11:09.621935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.621814 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 21:11:09.621935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.621898 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 21:11:09.622547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.622529 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 21:11:09.622547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.622542 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-cfb5b\"" Apr 22 21:11:09.627861 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.627842 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 21:11:09.628190 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.628143 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd"] Apr 22 21:11:09.629204 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.629182 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-sc42w"] Apr 22 21:11:09.738019 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.737978 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:09.738019 ip-10-0-143-252 kubenswrapper[2568]: I0422 
21:11:09.738022 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7qgf\" (UniqueName: \"kubernetes.io/projected/33c46c61-c2a6-4c05-bf38-c25734b80329-kube-api-access-w7qgf\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.738239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.738045 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c46c61-c2a6-4c05-bf38-c25734b80329-config\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.738239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.738102 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c46c61-c2a6-4c05-bf38-c25734b80329-serving-cert\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.738239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.738125 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsjwb\" (UniqueName: \"kubernetes.io/projected/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-kube-api-access-wsjwb\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:09.738239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.738148 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33c46c61-c2a6-4c05-bf38-c25734b80329-trusted-ca\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.838980 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.838951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c46c61-c2a6-4c05-bf38-c25734b80329-serving-cert\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.838980 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.838985 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsjwb\" (UniqueName: \"kubernetes.io/projected/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-kube-api-access-wsjwb\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:09.839218 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.839005 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33c46c61-c2a6-4c05-bf38-c25734b80329-trusted-ca\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 
21:11:09.839218 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.839187 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:09.839337 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.839218 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7qgf\" (UniqueName: \"kubernetes.io/projected/33c46c61-c2a6-4c05-bf38-c25734b80329-kube-api-access-w7qgf\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.839337 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.839242 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c46c61-c2a6-4c05-bf38-c25734b80329-config\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.839337 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:09.839323 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 21:11:09.839509 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:09.839410 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls podName:2fb993f7-e16d-49e3-ba93-54b4ff9d7c20 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:10.339372736 +0000 UTC m=+117.083421428 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-znqrd" (UID: "2fb993f7-e16d-49e3-ba93-54b4ff9d7c20") : secret "samples-operator-tls" not found Apr 22 21:11:09.839812 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.839795 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/33c46c61-c2a6-4c05-bf38-c25734b80329-trusted-ca\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.839865 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.839845 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c46c61-c2a6-4c05-bf38-c25734b80329-config\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.841369 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.841351 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c46c61-c2a6-4c05-bf38-c25734b80329-serving-cert\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.847848 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.847816 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7qgf\" (UniqueName: \"kubernetes.io/projected/33c46c61-c2a6-4c05-bf38-c25734b80329-kube-api-access-w7qgf\") pod \"console-operator-9d4b6777b-sc42w\" (UID: \"33c46c61-c2a6-4c05-bf38-c25734b80329\") " pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:09.850027 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.849994 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsjwb\" (UniqueName: \"kubernetes.io/projected/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-kube-api-access-wsjwb\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:09.935298 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:09.935207 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:10.048128 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:10.048093 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-sc42w"] Apr 22 21:11:10.051058 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:10.051032 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33c46c61_c2a6_4c05_bf38_c25734b80329.slice/crio-65b4dbae90bf791c9deb3b90e17de7f9b1f0b49156d59af9eaf3402d0a04e1e2 WatchSource:0}: Error finding container 65b4dbae90bf791c9deb3b90e17de7f9b1f0b49156d59af9eaf3402d0a04e1e2: Status 404 returned error can't find the container with id 65b4dbae90bf791c9deb3b90e17de7f9b1f0b49156d59af9eaf3402d0a04e1e2 Apr 22 21:11:10.206429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:10.206317 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" event={"ID":"33c46c61-c2a6-4c05-bf38-c25734b80329","Type":"ContainerStarted","Data":"65b4dbae90bf791c9deb3b90e17de7f9b1f0b49156d59af9eaf3402d0a04e1e2"} Apr 22 21:11:10.343531 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:10.343494 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:10.343707 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:10.343604 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 21:11:10.343707 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:10.343685 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls podName:2fb993f7-e16d-49e3-ba93-54b4ff9d7c20 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:11.34367022 +0000 UTC m=+118.087718897 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-znqrd" (UID: "2fb993f7-e16d-49e3-ba93-54b4ff9d7c20") : secret "samples-operator-tls" not found Apr 22 21:11:11.354401 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:11.354344 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:11.354867 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:11.354524 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 21:11:11.354867 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:11.354614 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls podName:2fb993f7-e16d-49e3-ba93-54b4ff9d7c20 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:13.354592641 +0000 UTC m=+120.098641334 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-znqrd" (UID: "2fb993f7-e16d-49e3-ba93-54b4ff9d7c20") : secret "samples-operator-tls" not found Apr 22 21:11:12.211531 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:12.211504 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/0.log" Apr 22 21:11:12.211693 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:12.211544 2568 generic.go:358] "Generic (PLEG): container finished" podID="33c46c61-c2a6-4c05-bf38-c25734b80329" containerID="7e29309cf1a40437f074c6741bbd27f5b742d2e088ccebc7157684eb880835cb" exitCode=255 Apr 22 21:11:12.211693 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:12.211611 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" event={"ID":"33c46c61-c2a6-4c05-bf38-c25734b80329","Type":"ContainerDied","Data":"7e29309cf1a40437f074c6741bbd27f5b742d2e088ccebc7157684eb880835cb"} Apr 22 21:11:12.211846 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:12.211831 2568 scope.go:117] "RemoveContainer" containerID="7e29309cf1a40437f074c6741bbd27f5b742d2e088ccebc7157684eb880835cb" Apr 22 21:11:13.215054 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:13.215025 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/1.log" Apr 22 21:11:13.215466 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:13.215412 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/0.log" Apr 22 21:11:13.215466 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:13.215445 2568 generic.go:358] "Generic (PLEG): container finished" podID="33c46c61-c2a6-4c05-bf38-c25734b80329" 
containerID="b0ecee078f1b104d110e6f010bab6ded0f79097312076138c872cc60f10a4206" exitCode=255 Apr 22 21:11:13.215540 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:13.215488 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" event={"ID":"33c46c61-c2a6-4c05-bf38-c25734b80329","Type":"ContainerDied","Data":"b0ecee078f1b104d110e6f010bab6ded0f79097312076138c872cc60f10a4206"} Apr 22 21:11:13.215540 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:13.215516 2568 scope.go:117] "RemoveContainer" containerID="7e29309cf1a40437f074c6741bbd27f5b742d2e088ccebc7157684eb880835cb" Apr 22 21:11:13.215756 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:13.215741 2568 scope.go:117] "RemoveContainer" containerID="b0ecee078f1b104d110e6f010bab6ded0f79097312076138c872cc60f10a4206" Apr 22 21:11:13.215943 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:13.215923 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sc42w_openshift-console-operator(33c46c61-c2a6-4c05-bf38-c25734b80329)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" podUID="33c46c61-c2a6-4c05-bf38-c25734b80329" Apr 22 21:11:13.371889 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:13.371837 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:13.372037 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:13.371958 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 21:11:13.372093 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:13.372063 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls podName:2fb993f7-e16d-49e3-ba93-54b4ff9d7c20 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:17.372032599 +0000 UTC m=+124.116081302 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-znqrd" (UID: "2fb993f7-e16d-49e3-ba93-54b4ff9d7c20") : secret "samples-operator-tls" not found Apr 22 21:11:14.218576 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:14.218543 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/1.log" Apr 22 21:11:14.218947 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:14.218901 2568 scope.go:117] "RemoveContainer" containerID="b0ecee078f1b104d110e6f010bab6ded0f79097312076138c872cc60f10a4206" Apr 22 21:11:14.219074 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:14.219056 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sc42w_openshift-console-operator(33c46c61-c2a6-4c05-bf38-c25734b80329)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" podUID="33c46c61-c2a6-4c05-bf38-c25734b80329" Apr 22 21:11:15.186263 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.186206 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:15.186458 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:15.186362 2568 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:15.186458 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:15.186444 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls podName:533714a1-f27e-40c7-8284-efe7ee67acf7 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:31.186425969 +0000 UTC m=+137.930474656 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-njxpb" (UID: "533714a1-f27e-40c7-8284-efe7ee67acf7") : secret "cluster-monitoring-operator-tls" not found Apr 22 21:11:15.483703 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.483633 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8"] Apr 22 21:11:15.487706 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.487689 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" Apr 22 21:11:15.489935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.489913 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-kcpz8\"" Apr 22 21:11:15.490028 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.489973 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 21:11:15.490841 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.490827 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 21:11:15.492846 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.492821 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8"] Apr 22 21:11:15.590041 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.590002 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmlkx\" (UniqueName: \"kubernetes.io/projected/dcd1d000-ff39-4932-83e0-62a9d1c5575b-kube-api-access-tmlkx\") pod \"migrator-74bb7799d9-vhcs8\" (UID: \"dcd1d000-ff39-4932-83e0-62a9d1c5575b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" Apr 22 21:11:15.691262 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.691226 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmlkx\" (UniqueName: \"kubernetes.io/projected/dcd1d000-ff39-4932-83e0-62a9d1c5575b-kube-api-access-tmlkx\") pod \"migrator-74bb7799d9-vhcs8\" (UID: \"dcd1d000-ff39-4932-83e0-62a9d1c5575b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" Apr 22 21:11:15.698969 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.698939 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmlkx\" (UniqueName: \"kubernetes.io/projected/dcd1d000-ff39-4932-83e0-62a9d1c5575b-kube-api-access-tmlkx\") pod \"migrator-74bb7799d9-vhcs8\" (UID: \"dcd1d000-ff39-4932-83e0-62a9d1c5575b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" Apr 22 21:11:15.797155 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.797067 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" Apr 22 21:11:15.910685 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:15.910638 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8"] Apr 22 21:11:15.913358 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:15.913326 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcd1d000_ff39_4932_83e0_62a9d1c5575b.slice/crio-5bd5a3388a19d24745729160f7709f8c21215ac889551e243e06c71bbfae9f8b WatchSource:0}: Error finding container 5bd5a3388a19d24745729160f7709f8c21215ac889551e243e06c71bbfae9f8b: Status 404 returned error can't find the container with id 5bd5a3388a19d24745729160f7709f8c21215ac889551e243e06c71bbfae9f8b Apr 22 21:11:16.223931 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.223897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" event={"ID":"dcd1d000-ff39-4932-83e0-62a9d1c5575b","Type":"ContainerStarted","Data":"5bd5a3388a19d24745729160f7709f8c21215ac889551e243e06c71bbfae9f8b"} Apr 22 21:11:16.368362 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.368329 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kdgqq"] Apr 22 21:11:16.372706 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.372065 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.374825 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.374768 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rwlrc\"" Apr 22 21:11:16.375034 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.374803 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 21:11:16.375034 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.374849 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 21:11:16.375212 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.375137 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 21:11:16.375212 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.375182 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 21:11:16.380893 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.380847 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kdgqq"] Apr 22 21:11:16.496915 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.496817 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/befc466d-f224-4b13-8b92-963767ecc9a0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.496915 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.496866 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.497372 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.496941 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/befc466d-f224-4b13-8b92-963767ecc9a0-crio-socket\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.497372 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.497011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/befc466d-f224-4b13-8b92-963767ecc9a0-data-volume\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.497372 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.497042 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfc8g\" (UniqueName: \"kubernetes.io/projected/befc466d-f224-4b13-8b92-963767ecc9a0-kube-api-access-gfc8g\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.597448 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.597413 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/befc466d-f224-4b13-8b92-963767ecc9a0-crio-socket\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.597645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.597479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/befc466d-f224-4b13-8b92-963767ecc9a0-data-volume\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.597645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.597513 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfc8g\" (UniqueName: \"kubernetes.io/projected/befc466d-f224-4b13-8b92-963767ecc9a0-kube-api-access-gfc8g\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.597645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.597556 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/befc466d-f224-4b13-8b92-963767ecc9a0-crio-socket\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.597645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.597576 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/befc466d-f224-4b13-8b92-963767ecc9a0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.597645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.597626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.597901 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:16.597783 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 21:11:16.597901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.597810 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/befc466d-f224-4b13-8b92-963767ecc9a0-data-volume\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.597901 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:16.597840 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls podName:befc466d-f224-4b13-8b92-963767ecc9a0 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:17.097824918 +0000 UTC m=+123.841873594 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-kdgqq" (UID: "befc466d-f224-4b13-8b92-963767ecc9a0") : secret "insights-runtime-extractor-tls" not found Apr 22 21:11:16.598117 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.598097 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/befc466d-f224-4b13-8b92-963767ecc9a0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:16.610138 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:16.610106 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfc8g\" (UniqueName: \"kubernetes.io/projected/befc466d-f224-4b13-8b92-963767ecc9a0-kube-api-access-gfc8g\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:17.101032 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:17.100980 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:17.101187 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:17.101140 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret 
"insights-runtime-extractor-tls" not found Apr 22 21:11:17.101230 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:17.101214 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls podName:befc466d-f224-4b13-8b92-963767ecc9a0 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:18.101196992 +0000 UTC m=+124.845245673 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-kdgqq" (UID: "befc466d-f224-4b13-8b92-963767ecc9a0") : secret "insights-runtime-extractor-tls" not found Apr 22 21:11:17.227610 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:17.227521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" event={"ID":"dcd1d000-ff39-4932-83e0-62a9d1c5575b","Type":"ContainerStarted","Data":"5dc2c87c446a00a12072fd0c5238597470b9e473525ca85d40e5ce2e01d075ab"} Apr 22 21:11:17.227610 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:17.227563 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" event={"ID":"dcd1d000-ff39-4932-83e0-62a9d1c5575b","Type":"ContainerStarted","Data":"b8652864f37013ecff82f62bc51b67daa7db8f6d46fbcddf3d83cc7846be063f"} Apr 22 21:11:17.244628 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:17.244566 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vhcs8" podStartSLOduration=1.217815155 podStartE2EDuration="2.244550415s" podCreationTimestamp="2026-04-22 21:11:15 +0000 UTC" firstStartedPulling="2026-04-22 21:11:15.915168718 +0000 UTC m=+122.659217399" lastFinishedPulling="2026-04-22 21:11:16.941903982 +0000 UTC m=+123.685952659" observedRunningTime="2026-04-22 21:11:17.242802603 +0000 UTC m=+123.986851302" watchObservedRunningTime="2026-04-22 21:11:17.244550415 +0000 UTC m=+123.988599144" Apr 22 21:11:17.404362 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:17.404319 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:17.404538 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:17.404485 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 21:11:17.404580 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:17.404546 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls podName:2fb993f7-e16d-49e3-ba93-54b4ff9d7c20 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:25.404531438 +0000 UTC m=+132.148580117 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-znqrd" (UID: "2fb993f7-e16d-49e3-ba93-54b4ff9d7c20") : secret "samples-operator-tls" not found Apr 22 21:11:18.109923 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:18.109887 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:18.110293 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:18.110011 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 21:11:18.110293 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:18.110065 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls podName:befc466d-f224-4b13-8b92-963767ecc9a0 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:20.11005024 +0000 UTC m=+126.854098922 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-kdgqq" (UID: "befc466d-f224-4b13-8b92-963767ecc9a0") : secret "insights-runtime-extractor-tls" not found Apr 22 21:11:19.935691 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:19.935646 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:19.935691 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:19.935693 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:19.936129 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:19.936048 2568 scope.go:117] "RemoveContainer" containerID="b0ecee078f1b104d110e6f010bab6ded0f79097312076138c872cc60f10a4206" Apr 22 21:11:19.936222 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:19.936202 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-sc42w_openshift-console-operator(33c46c61-c2a6-4c05-bf38-c25734b80329)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" podUID="33c46c61-c2a6-4c05-bf38-c25734b80329" Apr 22 21:11:20.124933 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:20.124890 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:20.125104 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:20.125036 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 21:11:20.125145 
ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:20.125110 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls podName:befc466d-f224-4b13-8b92-963767ecc9a0 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:24.12508448 +0000 UTC m=+130.869133173 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls") pod "insights-runtime-extractor-kdgqq" (UID: "befc466d-f224-4b13-8b92-963767ecc9a0") : secret "insights-runtime-extractor-tls" not found Apr 22 21:11:23.551670 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:23.551617 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:11:23.552152 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:23.551768 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 21:11:23.552152 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:23.551839 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs podName:605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f nodeName:}" failed. No retries permitted until 2026-04-22 21:13:25.551823192 +0000 UTC m=+252.295871869 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs") pod "network-metrics-daemon-hptqt" (UID: "605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f") : secret "metrics-daemon-secret" not found Apr 22 21:11:24.157309 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:24.157244 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:24.159668 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:24.159640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/befc466d-f224-4b13-8b92-963767ecc9a0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kdgqq\" (UID: \"befc466d-f224-4b13-8b92-963767ecc9a0\") " pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:24.184945 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:24.184906 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kdgqq" Apr 22 21:11:24.301800 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:24.301767 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kdgqq"] Apr 22 21:11:24.305838 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:24.305806 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbefc466d_f224_4b13_8b92_963767ecc9a0.slice/crio-47924b6dcd69ff7177631a25e8ebdf6d3679f464b22f8e82c6285277d89ce8f6 WatchSource:0}: Error finding container 47924b6dcd69ff7177631a25e8ebdf6d3679f464b22f8e82c6285277d89ce8f6: Status 404 returned error can't find the container with id 47924b6dcd69ff7177631a25e8ebdf6d3679f464b22f8e82c6285277d89ce8f6 Apr 22 21:11:25.246978 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:25.246892 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kdgqq" event={"ID":"befc466d-f224-4b13-8b92-963767ecc9a0","Type":"ContainerStarted","Data":"fa80c60ff5baf225a817a309ad4ed195b24a66321af601934f382732929ea907"} Apr 22 21:11:25.246978 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:25.246930 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kdgqq" event={"ID":"befc466d-f224-4b13-8b92-963767ecc9a0","Type":"ContainerStarted","Data":"e9bbee8d33b20b696246babc6daac55f4125e201b5af44a83634afd53cf5d4ff"} Apr 22 21:11:25.246978 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:25.246939 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kdgqq" event={"ID":"befc466d-f224-4b13-8b92-963767ecc9a0","Type":"ContainerStarted","Data":"47924b6dcd69ff7177631a25e8ebdf6d3679f464b22f8e82c6285277d89ce8f6"} Apr 22 21:11:25.467233 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:25.467176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:25.469945 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:25.469908 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb993f7-e16d-49e3-ba93-54b4ff9d7c20-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-znqrd\" (UID: \"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:25.531336 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:25.531263 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-7dqkf\"" Apr 22 21:11:25.539220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:25.539190 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" Apr 22 21:11:25.671278 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:25.671244 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd"] Apr 22 21:11:26.252231 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:26.252176 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" event={"ID":"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20","Type":"ContainerStarted","Data":"6567f5a75b7dd3c25e9c102bb5c1e65e32deb12172ae55ab24845980982c6d08"} Apr 22 21:11:27.257705 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:27.257668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kdgqq" event={"ID":"befc466d-f224-4b13-8b92-963767ecc9a0","Type":"ContainerStarted","Data":"8e5876ef3148536da9ef9b5f460523a4bccd86b3feefc15140b3315a1eb99ad1"} Apr 22 21:11:27.277139 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:27.277082 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kdgqq" podStartSLOduration=9.304187744 podStartE2EDuration="11.277063421s" podCreationTimestamp="2026-04-22 21:11:16 +0000 UTC" firstStartedPulling="2026-04-22 21:11:24.366930248 +0000 UTC m=+131.110978926" lastFinishedPulling="2026-04-22 21:11:26.339805912 +0000 UTC m=+133.083854603" observedRunningTime="2026-04-22 21:11:27.275024926 +0000 UTC m=+134.019073638" watchObservedRunningTime="2026-04-22 21:11:27.277063421 +0000 UTC m=+134.021112120" Apr 22 21:11:28.262021 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:28.261980 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" event={"ID":"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20","Type":"ContainerStarted","Data":"8ec7a6c6863f420b0e8b721ea161419b50ef0181cf61196175e1a3fad712f393"} Apr 22 21:11:28.262021 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:28.262020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" event={"ID":"2fb993f7-e16d-49e3-ba93-54b4ff9d7c20","Type":"ContainerStarted","Data":"c23ed092404053c6fb4a6637d10d11863ace93f45a9a8c91835a4dbdd9791f9f"} Apr 22 21:11:28.277632 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:28.277585 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-znqrd" podStartSLOduration=17.403047955 podStartE2EDuration="19.277569943s" podCreationTimestamp="2026-04-22 21:11:09 +0000 UTC" firstStartedPulling="2026-04-22 21:11:25.726485248 +0000 UTC m=+132.470533929" lastFinishedPulling="2026-04-22 21:11:27.601007236 +0000 UTC m=+134.345055917" observedRunningTime="2026-04-22 21:11:28.277084682 +0000 UTC m=+135.021133382" watchObservedRunningTime="2026-04-22 21:11:28.277569943 +0000 UTC m=+135.021618750" Apr 22 21:11:31.214555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:31.214496 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:31.216875 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:31.216848 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/533714a1-f27e-40c7-8284-efe7ee67acf7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-njxpb\" (UID: \"533714a1-f27e-40c7-8284-efe7ee67acf7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:31.487924 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:31.487842 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-ttp88\"" Apr 22 21:11:31.495840 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:31.495817 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" Apr 22 21:11:31.610591 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:31.610556 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb"] Apr 22 21:11:31.614571 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:31.614538 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod533714a1_f27e_40c7_8284_efe7ee67acf7.slice/crio-c81b7eceb1f63a0e7774545710151e04c1b61a4e344b813fdd428f88212b97a7 WatchSource:0}: Error finding container c81b7eceb1f63a0e7774545710151e04c1b61a4e344b813fdd428f88212b97a7: Status 404 returned error can't find the container with id c81b7eceb1f63a0e7774545710151e04c1b61a4e344b813fdd428f88212b97a7 Apr 22 21:11:32.273365 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:32.273326 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" event={"ID":"533714a1-f27e-40c7-8284-efe7ee67acf7","Type":"ContainerStarted","Data":"c81b7eceb1f63a0e7774545710151e04c1b61a4e344b813fdd428f88212b97a7"} Apr 22 21:11:33.277468 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:33.277424 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" event={"ID":"533714a1-f27e-40c7-8284-efe7ee67acf7","Type":"ContainerStarted","Data":"9ed581cb24028bdc18a204ac56a92926a2c3091a6186a7c1883bf466aa790cd8"} Apr 22 21:11:33.292068 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:33.292010 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-njxpb" podStartSLOduration=32.771758634 podStartE2EDuration="34.291996419s" podCreationTimestamp="2026-04-22 21:10:59 +0000 UTC" firstStartedPulling="2026-04-22 21:11:31.616566608 +0000 UTC m=+138.360615285" lastFinishedPulling="2026-04-22 21:11:33.136804389 +0000 UTC m=+139.880853070" observedRunningTime="2026-04-22 21:11:33.291357339 +0000 UTC m=+140.035406041" watchObservedRunningTime="2026-04-22 21:11:33.291996419 +0000 UTC m=+140.036045117" Apr 22 21:11:33.836573 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:33.836545 2568 scope.go:117] "RemoveContainer" containerID="b0ecee078f1b104d110e6f010bab6ded0f79097312076138c872cc60f10a4206" Apr 22 21:11:34.281386 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:34.281356 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log" Apr 22 21:11:34.281865 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:34.281757 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/1.log" Apr 22 21:11:34.281865 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:34.281793 2568 generic.go:358] "Generic (PLEG): container finished" podID="33c46c61-c2a6-4c05-bf38-c25734b80329" containerID="5219c732fd02ba9522e7a6c1ab1489d82511c1b52ebb28f606cd538422efc540" exitCode=255 Apr 22 21:11:34.281980 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:34.281863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" event={"ID":"33c46c61-c2a6-4c05-bf38-c25734b80329","Type":"ContainerDied","Data":"5219c732fd02ba9522e7a6c1ab1489d82511c1b52ebb28f606cd538422efc540"} Apr 22 21:11:34.281980 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:34.281900 2568 scope.go:117] "RemoveContainer" containerID="b0ecee078f1b104d110e6f010bab6ded0f79097312076138c872cc60f10a4206" Apr 22 21:11:34.282249 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:34.282232 2568 scope.go:117] "RemoveContainer" containerID="5219c732fd02ba9522e7a6c1ab1489d82511c1b52ebb28f606cd538422efc540" Apr 22 21:11:34.282406 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:34.282370 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-sc42w_openshift-console-operator(33c46c61-c2a6-4c05-bf38-c25734b80329)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" podUID="33c46c61-c2a6-4c05-bf38-c25734b80329" Apr 22 21:11:35.285981 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.285953 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log" Apr 22 21:11:35.634109 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.634071 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6"] Apr 22 21:11:35.638321 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.638295 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zqj22"] Apr 22 21:11:35.638501 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.638487 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6" Apr 22 21:11:35.640993 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.640971 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5g64n\"" Apr 22 21:11:35.641342 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.641327 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" Apr 22 21:11:35.643506 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.643482 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 21:11:35.643611 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.643482 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xxln5\"" Apr 22 21:11:35.643611 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.643521 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 21:11:35.646114 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.646097 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6"] Apr 22 21:11:35.649775 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.649755 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zqj22"] Apr 22 21:11:35.749598 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.749567 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b546034a-3b47-42da-a1d5-9685baf19e4f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zqj22\" (UID: \"b546034a-3b47-42da-a1d5-9685baf19e4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" Apr 22 21:11:35.749745 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.749602 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b546034a-3b47-42da-a1d5-9685baf19e4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zqj22\" (UID: \"b546034a-3b47-42da-a1d5-9685baf19e4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" Apr 22 21:11:35.749745 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.749662 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mml7w\" (UniqueName: \"kubernetes.io/projected/50847da2-6189-4704-b652-a6ab02809bf2-kube-api-access-mml7w\") pod \"network-check-source-8894fc9bd-jhrq6\" (UID: \"50847da2-6189-4704-b652-a6ab02809bf2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6" Apr 22 21:11:35.850654 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.850619 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b546034a-3b47-42da-a1d5-9685baf19e4f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zqj22\" (UID: \"b546034a-3b47-42da-a1d5-9685baf19e4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" Apr 22 21:11:35.850654 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.850660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b546034a-3b47-42da-a1d5-9685baf19e4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zqj22\" (UID: \"b546034a-3b47-42da-a1d5-9685baf19e4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" Apr 22 
21:11:35.850890 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.850694 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mml7w\" (UniqueName: \"kubernetes.io/projected/50847da2-6189-4704-b652-a6ab02809bf2-kube-api-access-mml7w\") pod \"network-check-source-8894fc9bd-jhrq6\" (UID: \"50847da2-6189-4704-b652-a6ab02809bf2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6" Apr 22 21:11:35.851299 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.851278 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b546034a-3b47-42da-a1d5-9685baf19e4f-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-zqj22\" (UID: \"b546034a-3b47-42da-a1d5-9685baf19e4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" Apr 22 21:11:35.853096 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.853077 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b546034a-3b47-42da-a1d5-9685baf19e4f-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-zqj22\" (UID: \"b546034a-3b47-42da-a1d5-9685baf19e4f\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" Apr 22 21:11:35.860940 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.860911 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mml7w\" (UniqueName: \"kubernetes.io/projected/50847da2-6189-4704-b652-a6ab02809bf2-kube-api-access-mml7w\") pod \"network-check-source-8894fc9bd-jhrq6\" (UID: \"50847da2-6189-4704-b652-a6ab02809bf2\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6" Apr 22 21:11:35.950838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.950744 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6" Apr 22 21:11:35.956467 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:35.956442 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" Apr 22 21:11:36.072691 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:36.072535 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6"] Apr 22 21:11:36.075330 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:36.075298 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50847da2_6189_4704_b652_a6ab02809bf2.slice/crio-31f4dd191fd598b2b315736d7ba65d4d88f82b1ef77060c9cf561514d18acc83 WatchSource:0}: Error finding container 31f4dd191fd598b2b315736d7ba65d4d88f82b1ef77060c9cf561514d18acc83: Status 404 returned error can't find the container with id 31f4dd191fd598b2b315736d7ba65d4d88f82b1ef77060c9cf561514d18acc83 Apr 22 21:11:36.089211 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:36.089175 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-zqj22"] Apr 22 21:11:36.092185 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:36.092158 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb546034a_3b47_42da_a1d5_9685baf19e4f.slice/crio-8d955111db8b6d6fdc60d9707b366e1d069597e40c619d4a8d112b5dce1d40d5 WatchSource:0}: Error finding container 8d955111db8b6d6fdc60d9707b366e1d069597e40c619d4a8d112b5dce1d40d5: Status 404 returned error can't find the container with id 8d955111db8b6d6fdc60d9707b366e1d069597e40c619d4a8d112b5dce1d40d5 Apr 22 21:11:36.291299 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:36.291193 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" event={"ID":"b546034a-3b47-42da-a1d5-9685baf19e4f","Type":"ContainerStarted","Data":"8d955111db8b6d6fdc60d9707b366e1d069597e40c619d4a8d112b5dce1d40d5"} Apr 22 21:11:36.292412 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:36.292369 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6" event={"ID":"50847da2-6189-4704-b652-a6ab02809bf2","Type":"ContainerStarted","Data":"e1ec5d7bff2e5408a92c88919f13be7d909a1ed9ae90a7cf30fb8cc37ed73eb4"} Apr 22 21:11:36.292534 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:36.292415 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6" event={"ID":"50847da2-6189-4704-b652-a6ab02809bf2","Type":"ContainerStarted","Data":"31f4dd191fd598b2b315736d7ba65d4d88f82b1ef77060c9cf561514d18acc83"} Apr 22 21:11:36.307720 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:36.307675 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-jhrq6" podStartSLOduration=1.307660527 podStartE2EDuration="1.307660527s" podCreationTimestamp="2026-04-22 21:11:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:11:36.306211417 +0000 UTC m=+143.050260114" watchObservedRunningTime="2026-04-22 21:11:36.307660527 +0000 UTC m=+143.051709276" Apr 22 21:11:37.296558 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:37.296520 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" 
event={"ID":"b546034a-3b47-42da-a1d5-9685baf19e4f","Type":"ContainerStarted","Data":"714c08ea694323afa63e317f4cd18636ce53a07444f32b3d7464c44ed1a9e260"} Apr 22 21:11:37.310580 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:37.310526 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-zqj22" podStartSLOduration=1.197345908 podStartE2EDuration="2.310511327s" podCreationTimestamp="2026-04-22 21:11:35 +0000 UTC" firstStartedPulling="2026-04-22 21:11:36.094148041 +0000 UTC m=+142.838196717" lastFinishedPulling="2026-04-22 21:11:37.207313455 +0000 UTC m=+143.951362136" observedRunningTime="2026-04-22 21:11:37.310081866 +0000 UTC m=+144.054130559" watchObservedRunningTime="2026-04-22 21:11:37.310511327 +0000 UTC m=+144.054560025" Apr 22 21:11:38.640144 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.640108 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-664wc"] Apr 22 21:11:38.643434 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.643409 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.646974 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.646944 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 21:11:38.647092 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.646944 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 21:11:38.647092 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.646996 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 21:11:38.647092 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.646952 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-g6zz5\"" Apr 22 21:11:38.652958 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.652934 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-664wc"] Apr 22 21:11:38.673094 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.673064 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63666cc7-3705-48c1-b0fd-3a0071a0c4de-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.673094 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.673102 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.673316 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.673141 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.673316 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.673237 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78wx9\" (UniqueName: \"kubernetes.io/projected/63666cc7-3705-48c1-b0fd-3a0071a0c4de-kube-api-access-78wx9\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.773611 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.773567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63666cc7-3705-48c1-b0fd-3a0071a0c4de-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.773611 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.773615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.773860 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.773656 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.773860 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.773707 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78wx9\" (UniqueName: \"kubernetes.io/projected/63666cc7-3705-48c1-b0fd-3a0071a0c4de-kube-api-access-78wx9\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.773860 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:38.773781 2568 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 21:11:38.774006 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:38.773864 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-tls podName:63666cc7-3705-48c1-b0fd-3a0071a0c4de nodeName:}" failed. No retries permitted until 2026-04-22 21:11:39.273842011 +0000 UTC m=+146.017890705 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-664wc" (UID: "63666cc7-3705-48c1-b0fd-3a0071a0c4de") : secret "prometheus-operator-tls" not found Apr 22 21:11:38.774269 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.774248 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63666cc7-3705-48c1-b0fd-3a0071a0c4de-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.776094 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.776062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:38.783801 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:38.783775 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78wx9\" (UniqueName: \"kubernetes.io/projected/63666cc7-3705-48c1-b0fd-3a0071a0c4de-kube-api-access-78wx9\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:39.276600 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:39.276554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:39.278852 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:39.278828 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/63666cc7-3705-48c1-b0fd-3a0071a0c4de-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-664wc\" (UID: \"63666cc7-3705-48c1-b0fd-3a0071a0c4de\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:39.552019 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:39.551911 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" Apr 22 21:11:39.668170 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:39.668138 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-664wc"] Apr 22 21:11:39.671307 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:39.671276 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63666cc7_3705_48c1_b0fd_3a0071a0c4de.slice/crio-d378b4c896dab6056dcd0f6b39391bdd9a14e1ab061ad34f905f9c31285c62c4 WatchSource:0}: Error finding container d378b4c896dab6056dcd0f6b39391bdd9a14e1ab061ad34f905f9c31285c62c4: Status 404 returned error can't find the container with id d378b4c896dab6056dcd0f6b39391bdd9a14e1ab061ad34f905f9c31285c62c4 Apr 22 21:11:39.936100 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:39.936048 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:39.936100 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:39.936104 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:11:39.936505 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:39.936488 2568 scope.go:117] "RemoveContainer" containerID="5219c732fd02ba9522e7a6c1ab1489d82511c1b52ebb28f606cd538422efc540" Apr 22 21:11:39.936680 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:39.936664 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-sc42w_openshift-console-operator(33c46c61-c2a6-4c05-bf38-c25734b80329)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" podUID="33c46c61-c2a6-4c05-bf38-c25734b80329" Apr 22 21:11:40.305515 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:40.305428 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" event={"ID":"63666cc7-3705-48c1-b0fd-3a0071a0c4de","Type":"ContainerStarted","Data":"d378b4c896dab6056dcd0f6b39391bdd9a14e1ab061ad34f905f9c31285c62c4"} Apr 22 21:11:41.309931 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:41.309847 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" event={"ID":"63666cc7-3705-48c1-b0fd-3a0071a0c4de","Type":"ContainerStarted","Data":"2d46540a30a9a2c06c3f97a7332698015e5bce575114f02806542fa9107cda38"} Apr 22 21:11:41.309931 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:41.309884 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" event={"ID":"63666cc7-3705-48c1-b0fd-3a0071a0c4de","Type":"ContainerStarted","Data":"afdefd77e76a600b4c4b13b174421e93f06270b49123b5f90f2ce4bcb7acbc3d"} Apr 22 21:11:41.325486 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:41.325430 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-664wc" podStartSLOduration=2.052949361 podStartE2EDuration="3.325414588s" podCreationTimestamp="2026-04-22 21:11:38 +0000 UTC" firstStartedPulling="2026-04-22 21:11:39.67314587 +0000 UTC m=+146.417194548" lastFinishedPulling="2026-04-22 21:11:40.945611094 +0000 UTC m=+147.689659775" 
observedRunningTime="2026-04-22 21:11:41.324594718 +0000 UTC m=+148.068643439" watchObservedRunningTime="2026-04-22 21:11:41.325414588 +0000 UTC m=+148.069463284" Apr 22 21:11:42.951010 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.950960 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt"] Apr 22 21:11:42.954276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.954257 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:42.956769 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.956743 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 21:11:42.956906 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.956792 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 21:11:42.956992 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.956976 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-cgl9k\"" Apr 22 21:11:42.957172 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.957153 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-c5f6s"] Apr 22 21:11:42.960114 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.960095 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:42.962557 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.962515 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-z7mkt\"" Apr 22 21:11:42.962722 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.962706 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 21:11:42.962991 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.962975 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 21:11:42.963636 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.963617 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 21:11:42.963966 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.963946 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt"] Apr 22 21:11:42.977460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.977430 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ltfpz"] Apr 22 21:11:42.980604 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.980583 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-c5f6s"] Apr 22 21:11:42.980711 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.980688 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:42.983402 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.983365 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 21:11:42.983516 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.983427 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 21:11:42.983642 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.983627 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qctg5\"" Apr 22 21:11:42.983810 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:42.983799 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 21:11:43.004277 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004248 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-textfile\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.004464 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004286 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.004464 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004323 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-tls\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.004464 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004370 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-root\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.004464 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004425 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.004464 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.004737 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004479 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.004737 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.004737 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004549 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhjpb\" (UniqueName: \"kubernetes.io/projected/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-kube-api-access-hhjpb\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.004737 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004590 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-accelerators-collector-config\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.004737 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004618 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htp7v\" (UniqueName: \"kubernetes.io/projected/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-kube-api-access-htp7v\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.004737 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004670 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.004737 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-sys\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.004737 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004734 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba347a0d-5022-49b0-bbe4-1cb18755020c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.005025 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004777 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ba347a0d-5022-49b0-bbe4-1cb18755020c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.005025 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004805 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.005025 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004823 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-592b4\" (UniqueName: \"kubernetes.io/projected/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-api-access-592b4\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.005025 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004839 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-metrics-client-ca\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.005025 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.004855 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-wtmp\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.105374 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105334 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-sys\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.105374 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105376 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba347a0d-5022-49b0-bbe4-1cb18755020c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.105662 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105426 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ba347a0d-5022-49b0-bbe4-1cb18755020c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.105662 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105456 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.105662 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105473 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-sys\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.105662 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-592b4\" (UniqueName: \"kubernetes.io/projected/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-api-access-592b4\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.105662 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:43.105594 2568 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 21:11:43.105662 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-metrics-client-ca\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.105662 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:43.105665 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-tls podName:ba347a0d-5022-49b0-bbe4-1cb18755020c nodeName:}" failed. No retries permitted until 2026-04-22 21:11:43.605644203 +0000 UTC m=+150.349692901 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-c5f6s" (UID: "ba347a0d-5022-49b0-bbe4-1cb18755020c") : secret "kube-state-metrics-tls" not found Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105704 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-wtmp\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105741 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-textfile\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-tls\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105862 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ba347a0d-5022-49b0-bbe4-1cb18755020c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-root\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105934 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-wtmp\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:43.106023 2568 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106053 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhjpb\" (UniqueName: \"kubernetes.io/projected/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-kube-api-access-hhjpb\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:43.106064 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-tls podName:f2c3bfd7-d56b-43d2-a164-4289b4e780d6 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:43.606051523 +0000 UTC m=+150.350100225 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-tls") pod "node-exporter-ltfpz" (UID: "f2c3bfd7-d56b-43d2-a164-4289b4e780d6") : secret "node-exporter-tls" not found Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106089 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-textfile\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106101 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-accelerators-collector-config\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htp7v\" (UniqueName: \"kubernetes.io/projected/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-kube-api-access-htp7v\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106176 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106232 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-metrics-client-ca\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106240 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba347a0d-5022-49b0-bbe4-1cb18755020c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.106985 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106556 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.106985 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.105924 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-root\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106985 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106859 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-accelerators-collector-config\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.106985 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.106932 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.108585 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.108556 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.108891 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.108602 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.108891 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.108721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.108891 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.108768 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.118195 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.118169 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhjpb\" (UniqueName: \"kubernetes.io/projected/168b8bff-402c-4cdc-9b0e-56436a4fd9d8-kube-api-access-hhjpb\") pod \"openshift-state-metrics-9d44df66c-wxhxt\" (UID: \"168b8bff-402c-4cdc-9b0e-56436a4fd9d8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.118369 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.118348 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-592b4\" (UniqueName: \"kubernetes.io/projected/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-api-access-592b4\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.118634 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.118616 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htp7v\" (UniqueName: \"kubernetes.io/projected/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-kube-api-access-htp7v\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.263497 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.263405 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" Apr 22 21:11:43.385589 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.385552 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt"] Apr 22 21:11:43.388249 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:43.388222 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168b8bff_402c_4cdc_9b0e_56436a4fd9d8.slice/crio-8ac2de18041e919aae8d2627a97ecbad9ee5d1d21a8dc77ed2996988c1dbd4ce WatchSource:0}: Error finding container 8ac2de18041e919aae8d2627a97ecbad9ee5d1d21a8dc77ed2996988c1dbd4ce: Status 404 returned error can't find the container with id 8ac2de18041e919aae8d2627a97ecbad9ee5d1d21a8dc77ed2996988c1dbd4ce Apr 22 21:11:43.609452 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.609410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.609639 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.609494 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-tls\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.611764 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.611731 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f2c3bfd7-d56b-43d2-a164-4289b4e780d6-node-exporter-tls\") pod \"node-exporter-ltfpz\" (UID: \"f2c3bfd7-d56b-43d2-a164-4289b4e780d6\") " pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.611880 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.611841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ba347a0d-5022-49b0-bbe4-1cb18755020c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-c5f6s\" (UID: \"ba347a0d-5022-49b0-bbe4-1cb18755020c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.870573 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.870527 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" Apr 22 21:11:43.892386 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:43.892347 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ltfpz" Apr 22 21:11:43.901556 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:43.901522 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c3bfd7_d56b_43d2_a164_4289b4e780d6.slice/crio-551780cffc72471b1421c05fb3266eb48b778f279a87dbe86f711899b81fc597 WatchSource:0}: Error finding container 551780cffc72471b1421c05fb3266eb48b778f279a87dbe86f711899b81fc597: Status 404 returned error can't find the container with id 551780cffc72471b1421c05fb3266eb48b778f279a87dbe86f711899b81fc597 Apr 22 21:11:44.004969 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.004939 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-c5f6s"] Apr 22 21:11:44.009742 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:44.009702 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba347a0d_5022_49b0_bbe4_1cb18755020c.slice/crio-e11c073397878856f64df815de0afda4a86ddd715d0c5819b168b52fe20f6b5c WatchSource:0}: Error finding container e11c073397878856f64df815de0afda4a86ddd715d0c5819b168b52fe20f6b5c: Status 404 returned error can't find the container with id e11c073397878856f64df815de0afda4a86ddd715d0c5819b168b52fe20f6b5c Apr 22 21:11:44.028880 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.028839 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:11:44.033372 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.033346 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.037460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.035888 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 21:11:44.037460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.036101 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 21:11:44.037460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.036439 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 21:11:44.037460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.036667 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 21:11:44.037460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.036858 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 21:11:44.037460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.037112 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 21:11:44.037460 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.037304 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 21:11:44.037885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.037594 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 21:11:44.037885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.037696 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fxmp4\"" Apr 22 21:11:44.037885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.037864 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 21:11:44.043734 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.043708 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:11:44.113815 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.113219 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.114118 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.114023 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.114330 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.114314 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" 
(UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.114477 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.114462 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.114608 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.114594 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-web-config\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.114704 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.114692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.114800 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.114786 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.114939 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.114919 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.115063 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.115047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-config-volume\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.116483 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.116453 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.116577 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.116563 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdcft\" (UniqueName: 
\"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-kube-api-access-xdcft\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.116653 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.116638 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-config-out\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.116696 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.116672 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217514 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217421 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217514 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217478 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdcft\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-kube-api-access-xdcft\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217725 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217518 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-config-out\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217725 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217544 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217725 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217563 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217725 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217584 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217725 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217650 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217994 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.217994 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.217971 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-web-config\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.218088 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.218006 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.218088 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.218031 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.218088 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.218084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.218207 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.218116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-config-volume\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.218257 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.218233 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.218418 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.218365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.218611 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.218574 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.221523 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.221359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-config-out\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.221779 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.221745 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-tls-assets\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.221779 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.221772 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.221922 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.221834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.222511 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.222487 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.223544 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.223522 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.223627 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.223571 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.223627 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.223588 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-web-config\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.223966 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.223944 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-config-volume\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.225882 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.225862 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdcft\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-kube-api-access-xdcft\") pod \"alertmanager-main-0\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.321260 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.321217 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltfpz" event={"ID":"f2c3bfd7-d56b-43d2-a164-4289b4e780d6","Type":"ContainerStarted","Data":"551780cffc72471b1421c05fb3266eb48b778f279a87dbe86f711899b81fc597"} Apr 22 21:11:44.322458 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.322419 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" event={"ID":"ba347a0d-5022-49b0-bbe4-1cb18755020c","Type":"ContainerStarted","Data":"e11c073397878856f64df815de0afda4a86ddd715d0c5819b168b52fe20f6b5c"} Apr 22 21:11:44.324191 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.324158 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" event={"ID":"168b8bff-402c-4cdc-9b0e-56436a4fd9d8","Type":"ContainerStarted","Data":"adbb17419510c63314d475abcfcaabb6d7d50d8e8b871ad576b8084af2db8368"} Apr 22 21:11:44.324287 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.324195 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" event={"ID":"168b8bff-402c-4cdc-9b0e-56436a4fd9d8","Type":"ContainerStarted","Data":"6889a38e7e6ca703882aac7de879e9990f83c3b4e09a74834893c1b71f5efa73"} Apr 22 21:11:44.324287 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.324208 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" event={"ID":"168b8bff-402c-4cdc-9b0e-56436a4fd9d8","Type":"ContainerStarted","Data":"8ac2de18041e919aae8d2627a97ecbad9ee5d1d21a8dc77ed2996988c1dbd4ce"} Apr 22 21:11:44.349396 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.349355 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:11:44.593867 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:44.593832 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a370fd_e71f_40a5_a8df_f5254fc455df.slice/crio-3ef9f84470e15ef7a92761273774bf415783451a7e8f4b13204f5dc58ccafc20 WatchSource:0}: Error finding container 3ef9f84470e15ef7a92761273774bf415783451a7e8f4b13204f5dc58ccafc20: Status 404 returned error can't find the container with id 3ef9f84470e15ef7a92761273774bf415783451a7e8f4b13204f5dc58ccafc20 Apr 22 21:11:44.602311 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:44.602282 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:11:45.329516 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.329429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerStarted","Data":"3ef9f84470e15ef7a92761273774bf415783451a7e8f4b13204f5dc58ccafc20"} Apr 22 21:11:45.331595 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.331563 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltfpz" event={"ID":"f2c3bfd7-d56b-43d2-a164-4289b4e780d6","Type":"ContainerStarted","Data":"09775c9cda8ca5a19e6296716c5cf3708c14ace06c1c13a7e00d5f7cf2fe4181"} Apr 22 21:11:45.333629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.333599 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" event={"ID":"168b8bff-402c-4cdc-9b0e-56436a4fd9d8","Type":"ContainerStarted","Data":"d69f34274f51c5880fb1b2f72d26a0acc295af4d259194d9173dead89c72a580"} Apr 22 21:11:45.366724 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.366662 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-wxhxt" podStartSLOduration=2.386412639 podStartE2EDuration="3.366647083s" podCreationTimestamp="2026-04-22 21:11:42 +0000 UTC" firstStartedPulling="2026-04-22 21:11:43.511963132 +0000 UTC m=+150.256011810" lastFinishedPulling="2026-04-22 21:11:44.492197557 +0000 UTC m=+151.236246254" observedRunningTime="2026-04-22 21:11:45.365565889 +0000 UTC m=+152.109614588" watchObservedRunningTime="2026-04-22 21:11:45.366647083 +0000 UTC m=+152.110695783" Apr 22 21:11:45.943843 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.943811 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-69bdf86478-lhs6r"] Apr 22 21:11:45.947590 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.947571 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:45.950618 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.950386 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 21:11:45.950618 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.950423 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-zzwr8\"" Apr 22 21:11:45.950618 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.950455 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-858leirnsvgct\"" Apr 22 21:11:45.950618 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.950475 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 21:11:45.950618 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.950543 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 21:11:45.950965 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.950760 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 21:11:45.950965 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.950932 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 21:11:45.958372 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:45.958325 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-69bdf86478-lhs6r"] Apr 22 21:11:46.035111 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.035069 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.035318 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.035124 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzbpv\" (UniqueName: \"kubernetes.io/projected/4fd1aa70-782d-45cd-8d2d-dd1426761edb-kube-api-access-nzbpv\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.035318 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.035243 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.035318 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.035287 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.035318 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.035319 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.035588 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.035371 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-grpc-tls\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.035588 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.035444 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-tls\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.035588 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.035513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4fd1aa70-782d-45cd-8d2d-dd1426761edb-metrics-client-ca\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.136500 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.136462 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzbpv\" (UniqueName: \"kubernetes.io/projected/4fd1aa70-782d-45cd-8d2d-dd1426761edb-kube-api-access-nzbpv\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.136697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.136530 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.136697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.136550 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.136697 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:11:46.136572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.136697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.136603 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-grpc-tls\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.136697 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.136632 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-tls\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.136991 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.136969 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4fd1aa70-782d-45cd-8d2d-dd1426761edb-metrics-client-ca\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.137061 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.137015 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.137843 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.137811 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4fd1aa70-782d-45cd-8d2d-dd1426761edb-metrics-client-ca\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.139360 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.139307 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.139506 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.139482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " 
pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.139645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.139622 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-tls\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.139891 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.139870 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.140345 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.140319 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.140691 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.140669 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4fd1aa70-782d-45cd-8d2d-dd1426761edb-secret-grpc-tls\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.144218 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.144197 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzbpv\" (UniqueName: \"kubernetes.io/projected/4fd1aa70-782d-45cd-8d2d-dd1426761edb-kube-api-access-nzbpv\") pod \"thanos-querier-69bdf86478-lhs6r\" (UID: \"4fd1aa70-782d-45cd-8d2d-dd1426761edb\") " pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.259602 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.259511 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:46.339855 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.339799 2568 generic.go:358] "Generic (PLEG): container finished" podID="f2c3bfd7-d56b-43d2-a164-4289b4e780d6" containerID="09775c9cda8ca5a19e6296716c5cf3708c14ace06c1c13a7e00d5f7cf2fe4181" exitCode=0 Apr 22 21:11:46.340362 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.340332 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltfpz" event={"ID":"f2c3bfd7-d56b-43d2-a164-4289b4e780d6","Type":"ContainerDied","Data":"09775c9cda8ca5a19e6296716c5cf3708c14ace06c1c13a7e00d5f7cf2fe4181"} Apr 22 21:11:46.342853 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.342779 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" event={"ID":"ba347a0d-5022-49b0-bbe4-1cb18755020c","Type":"ContainerStarted","Data":"da42b51ae85a1950e280d4ab4f8e00bc4061f01f3c3e49e66c955144b9b78df3"} Apr 22 21:11:46.342853 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.342821 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" event={"ID":"ba347a0d-5022-49b0-bbe4-1cb18755020c","Type":"ContainerStarted","Data":"e69343c354dfab1a053e6ba6f11a9644ee1c9edb2fc6d0f4a9042d75f2b69166"} Apr 22 21:11:46.342853 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.342835 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" event={"ID":"ba347a0d-5022-49b0-bbe4-1cb18755020c","Type":"ContainerStarted","Data":"656fe0f041b904294f276316f6d7555df89adea7fc23911d293ed64b406eb6c0"} Apr 22 21:11:46.344349 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.344325 2568 generic.go:358] "Generic (PLEG): container finished" podID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerID="8e231e7bc3e90f7f51c8f6ca1c70a1a66d5315b1d2ff809cad28d41803c93534" exitCode=0 Apr 22 21:11:46.344545 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.344429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerDied","Data":"8e231e7bc3e90f7f51c8f6ca1c70a1a66d5315b1d2ff809cad28d41803c93534"} Apr 22 21:11:46.375829 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.375785 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-c5f6s" podStartSLOduration=2.685916304 podStartE2EDuration="4.375772023s" podCreationTimestamp="2026-04-22 21:11:42 +0000 UTC" firstStartedPulling="2026-04-22 21:11:44.012201288 +0000 UTC m=+150.756249981" lastFinishedPulling="2026-04-22 21:11:45.702057009 +0000 UTC m=+152.446105700" observedRunningTime="2026-04-22 21:11:46.375479628 +0000 UTC m=+153.119528328" watchObservedRunningTime="2026-04-22 21:11:46.375772023 +0000 UTC m=+153.119820722" Apr 22 21:11:46.395247 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:46.395216 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-69bdf86478-lhs6r"] Apr 22 21:11:46.399349 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:46.399323 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fd1aa70_782d_45cd_8d2d_dd1426761edb.slice/crio-ae86815609d34dfbcadeffee98115e8fac80f8dda823b3d75c1920c1bdadfb4d 
WatchSource:0}: Error finding container ae86815609d34dfbcadeffee98115e8fac80f8dda823b3d75c1920c1bdadfb4d: Status 404 returned error can't find the container with id ae86815609d34dfbcadeffee98115e8fac80f8dda823b3d75c1920c1bdadfb4d Apr 22 21:11:47.351310 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.351193 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltfpz" event={"ID":"f2c3bfd7-d56b-43d2-a164-4289b4e780d6","Type":"ContainerStarted","Data":"e90503a142d4d2cff748ff59a7233376b059e4b17337f6a7d0082a08e52852fa"} Apr 22 21:11:47.351310 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.351267 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ltfpz" event={"ID":"f2c3bfd7-d56b-43d2-a164-4289b4e780d6","Type":"ContainerStarted","Data":"9fe3e5eadcb691c8a543c211db8f37ada47364d5a516458a5cbcfddf76535873"} Apr 22 21:11:47.355030 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.353879 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" event={"ID":"4fd1aa70-782d-45cd-8d2d-dd1426761edb","Type":"ContainerStarted","Data":"ae86815609d34dfbcadeffee98115e8fac80f8dda823b3d75c1920c1bdadfb4d"} Apr 22 21:11:47.357414 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.357367 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7fd898795-zn7t9"] Apr 22 21:11:47.360710 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.360689 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.363455 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.363302 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 21:11:47.363455 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.363320 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-x7lvz\"" Apr 22 21:11:47.363455 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.363444 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7rjog9rnr0rp3\"" Apr 22 21:11:47.363773 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.363493 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 21:11:47.364083 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.364057 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 21:11:47.364200 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.364165 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 21:11:47.372238 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.372185 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7fd898795-zn7t9"] Apr 22 21:11:47.374161 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.374112 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ltfpz" podStartSLOduration=4.3444408 podStartE2EDuration="5.374094738s" podCreationTimestamp="2026-04-22 21:11:42 +0000 UTC" firstStartedPulling="2026-04-22 21:11:43.903196362 +0000 UTC 
m=+150.647245054" lastFinishedPulling="2026-04-22 21:11:44.932850314 +0000 UTC m=+151.676898992" observedRunningTime="2026-04-22 21:11:47.372567773 +0000 UTC m=+154.116616471" watchObservedRunningTime="2026-04-22 21:11:47.374094738 +0000 UTC m=+154.118143440" Apr 22 21:11:47.449767 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.449707 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-secret-metrics-server-tls\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.449947 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.449815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-client-ca-bundle\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.449947 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.449912 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/510cb629-1f16-4d62-b114-d87845a195c6-metrics-server-audit-profiles\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.450073 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.450018 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cpfn\" (UniqueName: \"kubernetes.io/projected/510cb629-1f16-4d62-b114-d87845a195c6-kube-api-access-8cpfn\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.450134 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.450099 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/510cb629-1f16-4d62-b114-d87845a195c6-audit-log\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.450189 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.450158 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-secret-metrics-server-client-certs\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.450238 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.450200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510cb629-1f16-4d62-b114-d87845a195c6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.551398 ip-10-0-143-252 kubenswrapper[2568]: I0422 
21:11:47.551351 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cpfn\" (UniqueName: \"kubernetes.io/projected/510cb629-1f16-4d62-b114-d87845a195c6-kube-api-access-8cpfn\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.551571 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.551522 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/510cb629-1f16-4d62-b114-d87845a195c6-audit-log\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.551629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.551572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-secret-metrics-server-client-certs\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.551629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.551600 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510cb629-1f16-4d62-b114-d87845a195c6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.551751 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.551651 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-secret-metrics-server-tls\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.551751 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.551713 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-client-ca-bundle\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.551854 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.551784 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/510cb629-1f16-4d62-b114-d87845a195c6-metrics-server-audit-profiles\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.551905 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.551869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/510cb629-1f16-4d62-b114-d87845a195c6-audit-log\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.552436 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.552405 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510cb629-1f16-4d62-b114-d87845a195c6-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.552724 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.552701 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/510cb629-1f16-4d62-b114-d87845a195c6-metrics-server-audit-profiles\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.554466 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.554444 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-secret-metrics-server-tls\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.554614 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.554594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-secret-metrics-server-client-certs\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.554966 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.554946 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/510cb629-1f16-4d62-b114-d87845a195c6-client-ca-bundle\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.559094 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.559069 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cpfn\" (UniqueName: \"kubernetes.io/projected/510cb629-1f16-4d62-b114-d87845a195c6-kube-api-access-8cpfn\") pod \"metrics-server-7fd898795-zn7t9\" (UID: \"510cb629-1f16-4d62-b114-d87845a195c6\") " pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:47.674233 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:47.674152 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:11:48.342288 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:48.342262 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7fd898795-zn7t9"] Apr 22 21:11:48.353685 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:48.353645 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod510cb629_1f16_4d62_b114_d87845a195c6.slice/crio-537a4b39bff52bf481e22c74b034d5de7297557e2ecd76cb33eb5fbb654376c7 WatchSource:0}: Error finding container 537a4b39bff52bf481e22c74b034d5de7297557e2ecd76cb33eb5fbb654376c7: Status 404 returned error can't find the container with id 537a4b39bff52bf481e22c74b034d5de7297557e2ecd76cb33eb5fbb654376c7 Apr 22 21:11:48.357818 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:48.357790 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" event={"ID":"510cb629-1f16-4d62-b114-d87845a195c6","Type":"ContainerStarted","Data":"537a4b39bff52bf481e22c74b034d5de7297557e2ecd76cb33eb5fbb654376c7"} Apr 22 21:11:48.359459 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:48.359428 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerStarted","Data":"6816c5c594fb5589d2cd80cabe89d3f2adfe282b7320ec27bffc764efebeb15e"} Apr 22 21:11:48.360983 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:48.360947 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" event={"ID":"4fd1aa70-782d-45cd-8d2d-dd1426761edb","Type":"ContainerStarted","Data":"5c711824ce84776347b66a984cc45042860517649a5fd50b15881da5291c34a5"} Apr 22 21:11:49.369466 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:49.369429 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerStarted","Data":"b2fabe93448748d0686eef03093d9f22da87f63515807ed7751cdb1b4c7cb92d"} Apr 22 21:11:49.369901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:49.369476 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerStarted","Data":"e1d2029ece3c15f1a6449d8dd72b769fa1d0f6b4a0bc067b8cf587c0678cdce1"} Apr 22 21:11:49.369901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:49.369493 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerStarted","Data":"7300c9515b660d3916cc4c7e9e077b4dd0fb3de54798757459a06e18dbd9d2af"} Apr 22 21:11:49.369901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:49.369507 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerStarted","Data":"2ec5089846279cf921c173ab811b54cffa0d8850fd6e31c7acf9748119463d82"} Apr 22 21:11:49.372038 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:49.371960 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" 
event={"ID":"4fd1aa70-782d-45cd-8d2d-dd1426761edb","Type":"ContainerStarted","Data":"2cfbf378594e84293532f513337cf6023a47d73ee46f2544ab2b1a604e0977ef"} Apr 22 21:11:49.372038 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:49.372001 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" event={"ID":"4fd1aa70-782d-45cd-8d2d-dd1426761edb","Type":"ContainerStarted","Data":"0f079335c13f9cfbc380f951041cf93479e0ca29727cf25dc4b99779e5b18a3d"} Apr 22 21:11:49.606402 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:49.606313 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qvbng" podUID="fd364cec-0032-4596-8b42-09cb588be2ad" Apr 22 21:11:49.621534 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:49.621434 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-m9f26" podUID="cd68afce-7631-4765-af7c-a614caf39491" Apr 22 21:11:50.378018 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.377929 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" event={"ID":"4fd1aa70-782d-45cd-8d2d-dd1426761edb","Type":"ContainerStarted","Data":"eb8f7a9427a4e2bfb45cb4573d28fefd59c9f150f62439a3710be363a069db19"} Apr 22 21:11:50.378018 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.377978 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" event={"ID":"4fd1aa70-782d-45cd-8d2d-dd1426761edb","Type":"ContainerStarted","Data":"72366e00aede3802ab0a0142fb14368257963af2ae53a0be3ea65ac779e438e3"} Apr 22 21:11:50.378018 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.377994 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" event={"ID":"4fd1aa70-782d-45cd-8d2d-dd1426761edb","Type":"ContainerStarted","Data":"307fcfd6701925b3b0a133e28a5ba7c41c360ce244c45f0e12b22385c72510be"} Apr 22 21:11:50.378618 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.378165 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:50.379446 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.379416 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" event={"ID":"510cb629-1f16-4d62-b114-d87845a195c6","Type":"ContainerStarted","Data":"d37b2aeb6cbd5deb99d7f7a01b43ec996e1f27d2d2966e2dbb049a8d4556138b"} Apr 22 21:11:50.382367 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.382339 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerStarted","Data":"f5db05833524d331cb7be1dfb60c28c0790a81abda193382745e3b044b774b87"} Apr 22 21:11:50.382502 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.382380 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qvbng" Apr 22 21:11:50.402071 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.402006 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" podStartSLOduration=2.514828613 podStartE2EDuration="5.401987467s" podCreationTimestamp="2026-04-22 21:11:45 +0000 UTC" firstStartedPulling="2026-04-22 21:11:46.401708383 +0000 UTC m=+153.145757066" lastFinishedPulling="2026-04-22 21:11:49.288867232 +0000 UTC m=+156.032915920" observedRunningTime="2026-04-22 21:11:50.400144658 +0000 UTC m=+157.144193362" watchObservedRunningTime="2026-04-22 21:11:50.401987467 +0000 UTC m=+157.146036168" Apr 22 21:11:50.424634 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.424582 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.717129966 podStartE2EDuration="6.424566128s" podCreationTimestamp="2026-04-22 21:11:44 +0000 UTC" firstStartedPulling="2026-04-22 21:11:44.596281073 +0000 UTC m=+151.340329756" lastFinishedPulling="2026-04-22 21:11:49.303717241 +0000 UTC m=+156.047765918" observedRunningTime="2026-04-22 21:11:50.422885599 +0000 UTC m=+157.166934298" watchObservedRunningTime="2026-04-22 21:11:50.424566128 +0000 UTC m=+157.168614828" Apr 22 21:11:50.444655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:50.444590 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" podStartSLOduration=1.857485752 podStartE2EDuration="3.444569717s" podCreationTimestamp="2026-04-22 21:11:47 +0000 UTC" firstStartedPulling="2026-04-22 21:11:48.356000164 +0000 UTC m=+155.100048842" lastFinishedPulling="2026-04-22 21:11:49.943084115 +0000 UTC m=+156.687132807" observedRunningTime="2026-04-22 21:11:50.442633283 +0000 UTC m=+157.186681983" watchObservedRunningTime="2026-04-22 21:11:50.444569717 +0000 UTC m=+157.188618418" Apr 22 21:11:50.848518 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:50.848470 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-hptqt" podUID="605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f" Apr 22 21:11:53.837735 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:53.837688 2568 scope.go:117] "RemoveContainer" containerID="5219c732fd02ba9522e7a6c1ab1489d82511c1b52ebb28f606cd538422efc540" Apr 22 21:11:53.838168 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:11:53.837927 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-sc42w_openshift-console-operator(33c46c61-c2a6-4c05-bf38-c25734b80329)\"" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" podUID="33c46c61-c2a6-4c05-bf38-c25734b80329" Apr 22 21:11:54.524479 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:54.524435 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:11:54.524637 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:54.524503 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:11:54.526899 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:54.526871 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd364cec-0032-4596-8b42-09cb588be2ad-metrics-tls\") pod \"dns-default-qvbng\" (UID: \"fd364cec-0032-4596-8b42-09cb588be2ad\") " pod="openshift-dns/dns-default-qvbng" Apr 22 21:11:54.527021 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:54.526964 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cd68afce-7631-4765-af7c-a614caf39491-cert\") pod \"ingress-canary-m9f26\" (UID: \"cd68afce-7631-4765-af7c-a614caf39491\") " pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:11:54.585496 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:54.585465 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8d48v\"" Apr 22 21:11:54.593580 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:54.593549 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qvbng" Apr 22 21:11:54.715865 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:54.715771 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qvbng"] Apr 22 21:11:54.718309 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:11:54.718285 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd364cec_0032_4596_8b42_09cb588be2ad.slice/crio-1e2167c3978769af5e1351c6ce4a03d10fbcfafe373f52ac80c8a933c52fb6f5 WatchSource:0}: Error finding container 1e2167c3978769af5e1351c6ce4a03d10fbcfafe373f52ac80c8a933c52fb6f5: Status 404 returned error can't find the container with id 1e2167c3978769af5e1351c6ce4a03d10fbcfafe373f52ac80c8a933c52fb6f5 Apr 22 21:11:55.399634 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:55.399594 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qvbng" event={"ID":"fd364cec-0032-4596-8b42-09cb588be2ad","Type":"ContainerStarted","Data":"1e2167c3978769af5e1351c6ce4a03d10fbcfafe373f52ac80c8a933c52fb6f5"} Apr 22 21:11:56.390903 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:56.390876 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-69bdf86478-lhs6r" Apr 22 21:11:56.403819 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:56.403791 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qvbng" event={"ID":"fd364cec-0032-4596-8b42-09cb588be2ad","Type":"ContainerStarted","Data":"d2d83ef01649811decda0400e069b483dc4f50169ff46bb908468cb25b5d1780"} Apr 22 21:11:56.404185 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:56.403827 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qvbng" event={"ID":"fd364cec-0032-4596-8b42-09cb588be2ad","Type":"ContainerStarted","Data":"3a432dd9d5c279de9e1479d8c80c13ac06bd61ef94b446136e0ada1919e92b58"} Apr 22 21:11:56.404185 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:56.403909 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-qvbng" Apr 22 21:11:56.429073 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:11:56.428835 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qvbng" podStartSLOduration=129.238705344 podStartE2EDuration="2m10.428813637s" podCreationTimestamp="2026-04-22 21:09:46 +0000 UTC" firstStartedPulling="2026-04-22 21:11:54.720200662 +0000 UTC m=+161.464249339" lastFinishedPulling="2026-04-22 21:11:55.910308956 +0000 UTC m=+162.654357632" observedRunningTime="2026-04-22 21:11:56.427615573 +0000 UTC m=+163.171664273" watchObservedRunningTime="2026-04-22 21:11:56.428813637 +0000 UTC m=+163.172862337" Apr 22 21:12:04.835775 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:04.835684 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:12:04.836116 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:04.835861 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:12:04.838363 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:04.838344 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tq52g\"" Apr 22 21:12:04.846421 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:04.846374 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m9f26" Apr 22 21:12:04.966994 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:04.966968 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m9f26"] Apr 22 21:12:04.969488 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:12:04.969459 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd68afce_7631_4765_af7c_a614caf39491.slice/crio-c0deed121b92bc8a364b3b8cba02a3e6b6856b434ecafefcbb974c1e53db334d WatchSource:0}: Error finding container c0deed121b92bc8a364b3b8cba02a3e6b6856b434ecafefcbb974c1e53db334d: Status 404 returned error can't find the container with id c0deed121b92bc8a364b3b8cba02a3e6b6856b434ecafefcbb974c1e53db334d Apr 22 21:12:05.434201 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:05.434161 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m9f26" event={"ID":"cd68afce-7631-4765-af7c-a614caf39491","Type":"ContainerStarted","Data":"c0deed121b92bc8a364b3b8cba02a3e6b6856b434ecafefcbb974c1e53db334d"} Apr 22 21:12:05.836414 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:05.836244 2568 scope.go:117] "RemoveContainer" containerID="5219c732fd02ba9522e7a6c1ab1489d82511c1b52ebb28f606cd538422efc540" Apr 22 21:12:06.410628 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.410598 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qvbng" Apr 22 21:12:06.452536 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.452506 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log" Apr 22 21:12:06.452736 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.452608 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" 
event={"ID":"33c46c61-c2a6-4c05-bf38-c25734b80329","Type":"ContainerStarted","Data":"f48be5e254c0d577973e2f8e7756f602d163dabf40d46f91eaf3f2872fa03bbf"} Apr 22 21:12:06.453134 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.453095 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:12:06.468359 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.468304 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" podStartSLOduration=55.712643875 podStartE2EDuration="57.468285442s" podCreationTimestamp="2026-04-22 21:11:09 +0000 UTC" firstStartedPulling="2026-04-22 21:11:10.052662195 +0000 UTC m=+116.796710872" lastFinishedPulling="2026-04-22 21:11:11.808303747 +0000 UTC m=+118.552352439" observedRunningTime="2026-04-22 21:12:06.467791221 +0000 UTC m=+173.211839921" watchObservedRunningTime="2026-04-22 21:12:06.468285442 +0000 UTC m=+173.212334146" Apr 22 21:12:06.497611 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.497579 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-sc42w" Apr 22 21:12:06.664174 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.664088 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-9gw7r"] Apr 22 21:12:06.668897 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.668864 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-9gw7r" Apr 22 21:12:06.671555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.671528 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-2hpdn\"" Apr 22 21:12:06.671555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.671528 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 21:12:06.671751 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.671705 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 21:12:06.679692 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.679668 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-9gw7r"] Apr 22 21:12:06.835991 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.835951 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbd6\" (UniqueName: \"kubernetes.io/projected/d077eb7b-f486-4c86-a813-5f7063fd7016-kube-api-access-dkbd6\") pod \"downloads-6bcc868b7-9gw7r\" (UID: \"d077eb7b-f486-4c86-a813-5f7063fd7016\") " pod="openshift-console/downloads-6bcc868b7-9gw7r" Apr 22 21:12:06.937401 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.937359 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbd6\" (UniqueName: \"kubernetes.io/projected/d077eb7b-f486-4c86-a813-5f7063fd7016-kube-api-access-dkbd6\") pod \"downloads-6bcc868b7-9gw7r\" (UID: \"d077eb7b-f486-4c86-a813-5f7063fd7016\") " pod="openshift-console/downloads-6bcc868b7-9gw7r" Apr 22 21:12:06.945201 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.945177 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbd6\" (UniqueName: 
\"kubernetes.io/projected/d077eb7b-f486-4c86-a813-5f7063fd7016-kube-api-access-dkbd6\") pod \"downloads-6bcc868b7-9gw7r\" (UID: \"d077eb7b-f486-4c86-a813-5f7063fd7016\") " pod="openshift-console/downloads-6bcc868b7-9gw7r" Apr 22 21:12:06.980850 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:06.980811 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-9gw7r" Apr 22 21:12:07.107695 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:07.107647 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-9gw7r"] Apr 22 21:12:07.111727 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:12:07.111699 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd077eb7b_f486_4c86_a813_5f7063fd7016.slice/crio-d88ec32894b5b57c6aee3fdd1c977e3e49579d57bd05e3f565ab8c86029f020e WatchSource:0}: Error finding container d88ec32894b5b57c6aee3fdd1c977e3e49579d57bd05e3f565ab8c86029f020e: Status 404 returned error can't find the container with id d88ec32894b5b57c6aee3fdd1c977e3e49579d57bd05e3f565ab8c86029f020e Apr 22 21:12:07.457173 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:07.457133 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-9gw7r" event={"ID":"d077eb7b-f486-4c86-a813-5f7063fd7016","Type":"ContainerStarted","Data":"d88ec32894b5b57c6aee3fdd1c977e3e49579d57bd05e3f565ab8c86029f020e"} Apr 22 21:12:07.458655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:07.458626 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m9f26" event={"ID":"cd68afce-7631-4765-af7c-a614caf39491","Type":"ContainerStarted","Data":"759e6fb65dfd111a25131d33691d7d5fdaf5bdb3cbffacd5e65607e4469f5b86"} Apr 22 21:12:07.474844 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:07.474793 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m9f26" podStartSLOduration=139.579615862 podStartE2EDuration="2m21.474777605s" podCreationTimestamp="2026-04-22 21:09:46 +0000 UTC" firstStartedPulling="2026-04-22 21:12:04.97142617 +0000 UTC m=+171.715474846" lastFinishedPulling="2026-04-22 21:12:06.866587894 +0000 UTC m=+173.610636589" observedRunningTime="2026-04-22 21:12:07.47364582 +0000 UTC m=+174.217694521" watchObservedRunningTime="2026-04-22 21:12:07.474777605 +0000 UTC m=+174.218826304" Apr 22 21:12:07.674604 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:07.674558 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:12:07.674789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:07.674618 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:12:20.174473 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.174435 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f9cd5c99d-cqfbd"] Apr 22 21:12:20.193820 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.193788 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f9cd5c99d-cqfbd"] Apr 22 21:12:20.193989 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.193929 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.197826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.197559 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 21:12:20.197826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.197610 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 21:12:20.197826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.197621 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 21:12:20.197826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.197687 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-q7nd6\"" Apr 22 21:12:20.197826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.197567 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 21:12:20.197826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.197624 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 21:12:20.260674 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.260635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-oauth-config\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.260867 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.260733 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldngt\" (UniqueName: \"kubernetes.io/projected/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-kube-api-access-ldngt\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.260940 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.260868 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-service-ca\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.260940 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.260910 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-oauth-serving-cert\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.261061 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.260970 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-serving-cert\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.261061 ip-10-0-143-252 kubenswrapper[2568]: I0422 
21:12:20.261004 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-config\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.361676 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.361630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-oauth-serving-cert\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.361888 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.361694 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-serving-cert\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.361888 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.361785 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-config\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.361888 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.361836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-oauth-config\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.362046 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.361910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldngt\" (UniqueName: \"kubernetes.io/projected/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-kube-api-access-ldngt\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.362046 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.361999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-service-ca\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.362650 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.362622 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-oauth-serving-cert\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.362650 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.362636 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-config\") pod \"console-5f9cd5c99d-cqfbd\" 
(UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.362872 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.362716 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-service-ca\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.364610 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.364586 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-oauth-config\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.364763 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.364742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-serving-cert\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.369845 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.369818 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldngt\" (UniqueName: \"kubernetes.io/projected/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-kube-api-access-ldngt\") pod \"console-5f9cd5c99d-cqfbd\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:20.506349 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:20.506262 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:12:23.136955 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:23.136925 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f9cd5c99d-cqfbd"] Apr 22 21:12:23.147608 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:12:23.147576 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7827c1f6_13ab_4f58_8ccb_4e9942c15f4a.slice/crio-a1a183f83d85980421df2d2e69f3bccd55eaa809b4b9e9d63272f2bef0291c56 WatchSource:0}: Error finding container a1a183f83d85980421df2d2e69f3bccd55eaa809b4b9e9d63272f2bef0291c56: Status 404 returned error can't find the container with id a1a183f83d85980421df2d2e69f3bccd55eaa809b4b9e9d63272f2bef0291c56 Apr 22 21:12:23.512352 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:23.512314 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-9gw7r" event={"ID":"d077eb7b-f486-4c86-a813-5f7063fd7016","Type":"ContainerStarted","Data":"6a41e2faf6f923fdc9d59dadbc9e13c9ea9fe186733bf11cda1baffd50f3743f"} Apr 22 21:12:23.512641 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:23.512586 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-9gw7r" Apr 22 21:12:23.514327 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:23.514291 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f9cd5c99d-cqfbd" event={"ID":"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a","Type":"ContainerStarted","Data":"a1a183f83d85980421df2d2e69f3bccd55eaa809b4b9e9d63272f2bef0291c56"} Apr 22 21:12:23.520963 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:23.520935 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-9gw7r" Apr 22 21:12:23.532243 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:23.532184 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-9gw7r" podStartSLOduration=1.5377217060000001 podStartE2EDuration="17.532165624s" podCreationTimestamp="2026-04-22 21:12:06 +0000 UTC" firstStartedPulling="2026-04-22 21:12:07.114001147 +0000 UTC m=+173.858049824" lastFinishedPulling="2026-04-22 21:12:23.108445061 +0000 UTC m=+189.852493742" observedRunningTime="2026-04-22 21:12:23.530778259 +0000 UTC m=+190.274826959" watchObservedRunningTime="2026-04-22 21:12:23.532165624 +0000 UTC m=+190.276214324" Apr 22 21:12:27.531186 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:27.531142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f9cd5c99d-cqfbd" event={"ID":"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a","Type":"ContainerStarted","Data":"5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758"} Apr 22 21:12:27.549547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:27.549491 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f9cd5c99d-cqfbd" podStartSLOduration=4.181156436 podStartE2EDuration="7.549473671s" podCreationTimestamp="2026-04-22 21:12:20 +0000 UTC" firstStartedPulling="2026-04-22 21:12:23.149627353 +0000 UTC m=+189.893676030" lastFinishedPulling="2026-04-22 21:12:26.517944576 +0000 UTC m=+193.261993265" observedRunningTime="2026-04-22 21:12:27.547170355 +0000 UTC m=+194.291219069" watchObservedRunningTime="2026-04-22 21:12:27.549473671 +0000 UTC m=+194.293522372" Apr 
22 21:12:27.681099 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:27.681067 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:12:27.685717 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:27.685687 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7fd898795-zn7t9" Apr 22 21:12:29.574968 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.574933 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d9c597cdd-6kmd2"] Apr 22 21:12:29.599189 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.599151 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d9c597cdd-6kmd2"] Apr 22 21:12:29.599379 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.599294 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.607917 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.607466 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 21:12:29.659647 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.659606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-config\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.659832 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.659657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-oauth-serving-cert\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.659832 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.659690 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-oauth-config\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.659832 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.659723 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-trusted-ca-bundle\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.659987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.659840 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-serving-cert\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.659987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.659896 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-service-ca\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.659987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.659960 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrr6r\" (UniqueName: \"kubernetes.io/projected/c75f4221-900c-41b9-a2d4-08ffbe740edc-kube-api-access-hrr6r\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.761120 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-config\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.761120 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-oauth-serving-cert\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.761370 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761145 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-oauth-config\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.761370 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-trusted-ca-bundle\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.761370 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761247 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-serving-cert\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.761370 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761291 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-service-ca\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:12:29.761370 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761321 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrr6r\" (UniqueName: \"kubernetes.io/projected/c75f4221-900c-41b9-a2d4-08ffbe740edc-kube-api-access-hrr6r\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2" 
Apr 22 21:12:29.761943 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-oauth-serving-cert\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:29.762070 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.761967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-config\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:29.762141 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.762103 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-service-ca\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:29.762185 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.762159 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-trusted-ca-bundle\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:29.763958 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.763938 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-oauth-config\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:29.764206 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.764184 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-serving-cert\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:29.769186 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.769161 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrr6r\" (UniqueName: \"kubernetes.io/projected/c75f4221-900c-41b9-a2d4-08ffbe740edc-kube-api-access-hrr6r\") pod \"console-d9c597cdd-6kmd2\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:29.911501 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:29.911460 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:30.055257 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:30.055229 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d9c597cdd-6kmd2"]
Apr 22 21:12:30.058423 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:12:30.058374 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75f4221_900c_41b9_a2d4_08ffbe740edc.slice/crio-b86d1ebfc12c1033a25a19e44113e0f939addb93f297ede8e2c11333a44e4f63 WatchSource:0}: Error finding container b86d1ebfc12c1033a25a19e44113e0f939addb93f297ede8e2c11333a44e4f63: Status 404 returned error can't find the container with id b86d1ebfc12c1033a25a19e44113e0f939addb93f297ede8e2c11333a44e4f63
Apr 22 21:12:30.506528 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:30.506441 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f9cd5c99d-cqfbd"
Apr 22 21:12:30.506528 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:30.506492 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f9cd5c99d-cqfbd"
Apr 22 21:12:30.511949 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:30.511922 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f9cd5c99d-cqfbd"
Apr 22 21:12:30.544336 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:30.544296 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9c597cdd-6kmd2" event={"ID":"c75f4221-900c-41b9-a2d4-08ffbe740edc","Type":"ContainerStarted","Data":"797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318"}
Apr 22 21:12:30.544336 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:30.544343 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9c597cdd-6kmd2" event={"ID":"c75f4221-900c-41b9-a2d4-08ffbe740edc","Type":"ContainerStarted","Data":"b86d1ebfc12c1033a25a19e44113e0f939addb93f297ede8e2c11333a44e4f63"}
Apr 22 21:12:30.548722 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:30.548689 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f9cd5c99d-cqfbd"
Apr 22 21:12:30.560366 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:30.560317 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d9c597cdd-6kmd2" podStartSLOduration=1.560301285 podStartE2EDuration="1.560301285s" podCreationTimestamp="2026-04-22 21:12:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:12:30.559698049 +0000 UTC m=+197.303746747" watchObservedRunningTime="2026-04-22 21:12:30.560301285 +0000 UTC m=+197.304349982"
Apr 22 21:12:39.911780 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:39.911728 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:39.911780 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:39.911786 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:39.916528 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:39.916504 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:40.579381 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:40.579347 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d9c597cdd-6kmd2"
Apr 22 21:12:40.622919 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:12:40.622885 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f9cd5c99d-cqfbd"]
Apr 22 21:13:03.140789 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.140746 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 21:13:03.141445 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.141357 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="alertmanager" containerID="cri-o://6816c5c594fb5589d2cd80cabe89d3f2adfe282b7320ec27bffc764efebeb15e" gracePeriod=120
Apr 22 21:13:03.141607 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.141431 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy-web" containerID="cri-o://7300c9515b660d3916cc4c7e9e077b4dd0fb3de54798757459a06e18dbd9d2af" gracePeriod=120
Apr 22 21:13:03.141607 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.141485 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="config-reloader" containerID="cri-o://2ec5089846279cf921c173ab811b54cffa0d8850fd6e31c7acf9748119463d82" gracePeriod=120
Apr 22 21:13:03.141607 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.141492 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy" containerID="cri-o://e1d2029ece3c15f1a6449d8dd72b769fa1d0f6b4a0bc067b8cf587c0678cdce1" gracePeriod=120
Apr 22 21:13:03.141607 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.141499 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="prom-label-proxy" containerID="cri-o://f5db05833524d331cb7be1dfb60c28c0790a81abda193382745e3b044b774b87" gracePeriod=120
Apr 22 21:13:03.141840 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.141432 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy-metric" containerID="cri-o://b2fabe93448748d0686eef03093d9f22da87f63515807ed7751cdb1b4c7cb92d" gracePeriod=120
Apr 22 21:13:03.651133 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.651095 2568 generic.go:358] "Generic (PLEG): container finished" podID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerID="f5db05833524d331cb7be1dfb60c28c0790a81abda193382745e3b044b774b87" exitCode=0
Apr 22 21:13:03.651133 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.651125 2568 generic.go:358] "Generic (PLEG): container finished" podID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerID="e1d2029ece3c15f1a6449d8dd72b769fa1d0f6b4a0bc067b8cf587c0678cdce1" exitCode=0
Apr 22 21:13:03.651133 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.651135 2568 generic.go:358] "Generic (PLEG): container finished" podID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerID="2ec5089846279cf921c173ab811b54cffa0d8850fd6e31c7acf9748119463d82" exitCode=0
Apr 22 21:13:03.651133 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.651142 2568 generic.go:358] "Generic (PLEG): container finished" podID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerID="6816c5c594fb5589d2cd80cabe89d3f2adfe282b7320ec27bffc764efebeb15e" exitCode=0
Apr 22 21:13:03.651418 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.651165 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerDied","Data":"f5db05833524d331cb7be1dfb60c28c0790a81abda193382745e3b044b774b87"}
Apr 22 21:13:03.651418 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.651199 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerDied","Data":"e1d2029ece3c15f1a6449d8dd72b769fa1d0f6b4a0bc067b8cf587c0678cdce1"}
Apr 22 21:13:03.651418 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.651208 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerDied","Data":"2ec5089846279cf921c173ab811b54cffa0d8850fd6e31c7acf9748119463d82"}
Apr 22 21:13:03.651418 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:03.651217 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerDied","Data":"6816c5c594fb5589d2cd80cabe89d3f2adfe282b7320ec27bffc764efebeb15e"}
Apr 22 21:13:04.657062 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.657024 2568 generic.go:358] "Generic (PLEG): container finished" podID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerID="b2fabe93448748d0686eef03093d9f22da87f63515807ed7751cdb1b4c7cb92d" exitCode=0
Apr 22 21:13:04.657062 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.657050 2568 generic.go:358] "Generic (PLEG): container finished" podID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerID="7300c9515b660d3916cc4c7e9e077b4dd0fb3de54798757459a06e18dbd9d2af" exitCode=0
Apr 22 21:13:04.657510 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.657089 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerDied","Data":"b2fabe93448748d0686eef03093d9f22da87f63515807ed7751cdb1b4c7cb92d"}
Apr 22 21:13:04.657510 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.657124 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerDied","Data":"7300c9515b660d3916cc4c7e9e077b4dd0fb3de54798757459a06e18dbd9d2af"}
Apr 22 21:13:04.896553 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.896524 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 21:13:04.993111 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993011 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdcft\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-kube-api-access-xdcft\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993111 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993067 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-main-db\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993111 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993096 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-tls-assets\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993415 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993124 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-trusted-ca-bundle\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993415 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993163 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-metric\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993415 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993186 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-web\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993415 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993210 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-metrics-client-ca\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993415 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993254 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-main-tls\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993415 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993310 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-web-config\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993415 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993352 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993415 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993377 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-config-volume\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993779 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993432 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-config-out\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993779 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993478 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-cluster-tls-config\") pod \"38a370fd-e71f-40a5-a8df-f5254fc455df\" (UID: \"38a370fd-e71f-40a5-a8df-f5254fc455df\") "
Apr 22 21:13:04.993779 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993515 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:13:04.993779 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993574 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:13:04.993977 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993793 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-main-db\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:04.993977 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993812 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:04.993977 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.993865 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:13:04.996311 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.996278 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:13:04.996515 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.996368 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:13:04.996515 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.996452 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-kube-api-access-xdcft" (OuterVolumeSpecName: "kube-api-access-xdcft") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "kube-api-access-xdcft". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:13:04.996515 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.996464 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-config-out" (OuterVolumeSpecName: "config-out") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:13:04.997687 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.997659 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:13:04.997795 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.997690 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:13:04.997941 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.997913 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:13:04.997941 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:04.997931 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-config-volume" (OuterVolumeSpecName: "config-volume") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:13:05.001018 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.000987 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:13:05.008669 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.008637 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-web-config" (OuterVolumeSpecName: "web-config") pod "38a370fd-e71f-40a5-a8df-f5254fc455df" (UID: "38a370fd-e71f-40a5-a8df-f5254fc455df"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:13:05.094516 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094478 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xdcft\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-kube-api-access-xdcft\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094516 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094508 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/38a370fd-e71f-40a5-a8df-f5254fc455df-tls-assets\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094516 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094521 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094531 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094541 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/38a370fd-e71f-40a5-a8df-f5254fc455df-metrics-client-ca\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094550 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-main-tls\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094559 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-web-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094569 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094577 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-config-volume\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094587 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/38a370fd-e71f-40a5-a8df-f5254fc455df-config-out\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.094747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.094595 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/38a370fd-e71f-40a5-a8df-f5254fc455df-cluster-tls-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:13:05.642648 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.642581 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f9cd5c99d-cqfbd" podUID="7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" containerName="console" containerID="cri-o://5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758" gracePeriod=15
Apr 22 21:13:05.663123 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.663090 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"38a370fd-e71f-40a5-a8df-f5254fc455df","Type":"ContainerDied","Data":"3ef9f84470e15ef7a92761273774bf415783451a7e8f4b13204f5dc58ccafc20"}
Apr 22 21:13:05.663500 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.663142 2568 scope.go:117] "RemoveContainer" containerID="f5db05833524d331cb7be1dfb60c28c0790a81abda193382745e3b044b774b87"
Apr 22 21:13:05.663500 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.663174 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 21:13:05.712635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.712595 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 21:13:05.717055 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.717024 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 21:13:05.739701 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.739669 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 21:13:05.740035 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740024 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="config-reloader"
Apr 22 21:13:05.740075 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740037 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="config-reloader"
Apr 22 21:13:05.740075 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740047 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy"
Apr 22 21:13:05.740075 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740053 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy"
Apr 22 21:13:05.740075 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740061 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy-metric"
Apr 22 21:13:05.740075 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740069 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy-metric"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740080 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="alertmanager"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740088 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="alertmanager"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740106 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="init-config-reloader"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740112 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="init-config-reloader"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740123 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy-web"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740129 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy-web"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740134 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="prom-label-proxy"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740139 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="prom-label-proxy"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740195 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy-web"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740203 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy-metric"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740211 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="prom-label-proxy"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740217 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="kube-rbac-proxy"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740224 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="alertmanager"
Apr 22 21:13:05.740226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.740231 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" containerName="config-reloader"
Apr 22 21:13:05.753000 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.752970 2568 scope.go:117] "RemoveContainer" containerID="b2fabe93448748d0686eef03093d9f22da87f63515807ed7751cdb1b4c7cb92d"
Apr 22 21:13:05.761077 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.760928 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 21:13:05.761240 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.761109 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.763500 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.763407 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 21:13:05.763875 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.763540 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 21:13:05.763875 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.763609 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 21:13:05.763875 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.763678 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 21:13:05.763875 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.763695 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 21:13:05.764103 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.763970 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 21:13:05.764161 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.764127 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 21:13:05.764463 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.764293 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fxmp4\"" Apr 22 21:13:05.764463 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.764345 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 21:13:05.766656 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.766632 2568 scope.go:117] "RemoveContainer" containerID="e1d2029ece3c15f1a6449d8dd72b769fa1d0f6b4a0bc067b8cf587c0678cdce1" Apr 22 21:13:05.768618 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.768591 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 21:13:05.778377 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.778358 2568 scope.go:117] "RemoveContainer" containerID="7300c9515b660d3916cc4c7e9e077b4dd0fb3de54798757459a06e18dbd9d2af" Apr 22 21:13:05.786214 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.786195 2568 scope.go:117] "RemoveContainer" containerID="2ec5089846279cf921c173ab811b54cffa0d8850fd6e31c7acf9748119463d82" Apr 22 21:13:05.793097 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.793074 2568 scope.go:117] "RemoveContainer" containerID="6816c5c594fb5589d2cd80cabe89d3f2adfe282b7320ec27bffc764efebeb15e" Apr 22 21:13:05.799507 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799481 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/229bff03-46de-4d1a-b214-0574275ea562-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799608 
ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799527 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/229bff03-46de-4d1a-b214-0574275ea562-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799608 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799581 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799681 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799620 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/229bff03-46de-4d1a-b214-0574275ea562-config-out\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799681 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799645 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/229bff03-46de-4d1a-b214-0574275ea562-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799755 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799684 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799755 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799707 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799755 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/229bff03-46de-4d1a-b214-0574275ea562-tls-assets\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799872 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799766 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799872 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:13:05.799782 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-config-volume\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799872 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799797 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z282m\" (UniqueName: \"kubernetes.io/projected/229bff03-46de-4d1a-b214-0574275ea562-kube-api-access-z282m\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799872 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799818 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.799872 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.799837 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-web-config\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.800670 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.800655 2568 scope.go:117] "RemoveContainer" containerID="8e231e7bc3e90f7f51c8f6ca1c70a1a66d5315b1d2ff809cad28d41803c93534" Apr 22 21:13:05.841423 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.841375 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a370fd-e71f-40a5-a8df-f5254fc455df" path="/var/lib/kubelet/pods/38a370fd-e71f-40a5-a8df-f5254fc455df/volumes" Apr 22 21:13:05.900885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.900811 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.900885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.900862 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-config-volume\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.900892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z282m\" (UniqueName: \"kubernetes.io/projected/229bff03-46de-4d1a-b214-0574275ea562-kube-api-access-z282m\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.900924 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.900956 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-web-config\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.900999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/229bff03-46de-4d1a-b214-0574275ea562-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.901041 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/229bff03-46de-4d1a-b214-0574275ea562-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.901081 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.901132 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/229bff03-46de-4d1a-b214-0574275ea562-config-out\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.901186 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/229bff03-46de-4d1a-b214-0574275ea562-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.901236 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.901265 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.901298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/229bff03-46de-4d1a-b214-0574275ea562-tls-assets\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.901862 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.901642 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/229bff03-46de-4d1a-b214-0574275ea562-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.902987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.902671 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/229bff03-46de-4d1a-b214-0574275ea562-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.902987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.902941 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/229bff03-46de-4d1a-b214-0574275ea562-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.903983 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.903947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/229bff03-46de-4d1a-b214-0574275ea562-tls-assets\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.904890 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.904845 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-web-config\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.904989 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.904958 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.905077 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.905062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.905164 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.905134 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.905655 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.905620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/229bff03-46de-4d1a-b214-0574275ea562-config-out\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.905761 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.905708 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-config-volume\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.905963 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.905941 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.906385 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.906370 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/229bff03-46de-4d1a-b214-0574275ea562-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.909243 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.909215 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z282m\" (UniqueName: \"kubernetes.io/projected/229bff03-46de-4d1a-b214-0574275ea562-kube-api-access-z282m\") pod \"alertmanager-main-0\" (UID: \"229bff03-46de-4d1a-b214-0574275ea562\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:05.926370 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.926350 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f9cd5c99d-cqfbd_7827c1f6-13ab-4f58-8ccb-4e9942c15f4a/console/0.log" Apr 22 21:13:05.926507 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:05.926433 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:13:06.002450 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.002412 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-service-ca\") pod \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " Apr 22 21:13:06.002450 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.002454 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldngt\" (UniqueName: \"kubernetes.io/projected/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-kube-api-access-ldngt\") pod \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " Apr 22 21:13:06.002754 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.002475 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-serving-cert\") pod \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " Apr 22 21:13:06.002754 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.002494 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-oauth-serving-cert\") pod \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " Apr 22 21:13:06.002754 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.002563 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-oauth-config\") pod \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " Apr 22 21:13:06.002754 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.002583 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-config\") pod \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\" (UID: \"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a\") " Apr 22 21:13:06.003026 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.002898 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-service-ca" (OuterVolumeSpecName: "service-ca") pod "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" (UID: "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:13:06.003092 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.003027 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" (UID: "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:13:06.003092 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.003033 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-config" (OuterVolumeSpecName: "console-config") pod "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" (UID: "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:13:06.004912 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.004876 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" (UID: "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:13:06.005032 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.004944 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-kube-api-access-ldngt" (OuterVolumeSpecName: "kube-api-access-ldngt") pod "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" (UID: "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a"). InnerVolumeSpecName "kube-api-access-ldngt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:13:06.005426 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.005404 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" (UID: "7827c1f6-13ab-4f58-8ccb-4e9942c15f4a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:13:06.073155 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.073114 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:13:06.103547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.103505 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-oauth-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.103547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.103547 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.103733 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.103562 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-service-ca\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.103733 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.103576 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ldngt\" (UniqueName: \"kubernetes.io/projected/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-kube-api-access-ldngt\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.103733 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.103592 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-console-serving-cert\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.103733 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.103605 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a-oauth-serving-cert\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.219256 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.219223 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:13:06.223421 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:13:06.223372 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod229bff03_46de_4d1a_b214_0574275ea562.slice/crio-90e99f21f169b91fc01ea0892fc9b232840ad3c9eac32ab9798d34b29d41e3b8 WatchSource:0}: Error finding container 90e99f21f169b91fc01ea0892fc9b232840ad3c9eac32ab9798d34b29d41e3b8: Status 404 returned error can't find the container with id 90e99f21f169b91fc01ea0892fc9b232840ad3c9eac32ab9798d34b29d41e3b8 Apr 22 21:13:06.668535 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.668500 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f9cd5c99d-cqfbd_7827c1f6-13ab-4f58-8ccb-4e9942c15f4a/console/0.log" Apr 22 21:13:06.668951 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.668544 2568 generic.go:358] "Generic (PLEG): container finished" podID="7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" containerID="5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758" exitCode=2 Apr 22 21:13:06.668951 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.668610 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f9cd5c99d-cqfbd" Apr 22 21:13:06.668951 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.668629 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f9cd5c99d-cqfbd" event={"ID":"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a","Type":"ContainerDied","Data":"5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758"} Apr 22 21:13:06.668951 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.668667 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f9cd5c99d-cqfbd" event={"ID":"7827c1f6-13ab-4f58-8ccb-4e9942c15f4a","Type":"ContainerDied","Data":"a1a183f83d85980421df2d2e69f3bccd55eaa809b4b9e9d63272f2bef0291c56"} Apr 22 21:13:06.668951 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.668683 2568 scope.go:117] "RemoveContainer" containerID="5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758" Apr 22 21:13:06.669955 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.669929 2568 generic.go:358] "Generic (PLEG): container finished" podID="229bff03-46de-4d1a-b214-0574275ea562" containerID="c7a24ece1a9b6cedd1b58fe3c0222c9abb919ef3c1da321fff7a3372f0eb251d" exitCode=0 Apr 22 21:13:06.670070 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.669975 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"229bff03-46de-4d1a-b214-0574275ea562","Type":"ContainerDied","Data":"c7a24ece1a9b6cedd1b58fe3c0222c9abb919ef3c1da321fff7a3372f0eb251d"} Apr 22 21:13:06.670070 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.670007 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"229bff03-46de-4d1a-b214-0574275ea562","Type":"ContainerStarted","Data":"90e99f21f169b91fc01ea0892fc9b232840ad3c9eac32ab9798d34b29d41e3b8"} Apr 22 21:13:06.677910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.677894 2568 scope.go:117] "RemoveContainer" containerID="5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758" Apr 22 21:13:06.678193 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:13:06.678164 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758\": container with ID starting with 5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758 not found: ID does not exist" containerID="5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758" Apr 22 21:13:06.678337 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.678197 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758"} err="failed to get container status \"5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758\": rpc error: code = NotFound desc = could not find container \"5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758\": container with ID starting with 5eeed089e81c1a7d3d90c507db16c29e0acfa3d1c6934556d50807009d995758 not found: ID does not exist" Apr 22 21:13:06.713123 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.713095 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f9cd5c99d-cqfbd"] Apr 22 21:13:06.716536 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:06.716509 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f9cd5c99d-cqfbd"] 
Apr 22 21:13:07.173643 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.173610 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g"] Apr 22 21:13:07.174045 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.174030 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" containerName="console" Apr 22 21:13:07.174045 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.174047 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" containerName="console" Apr 22 21:13:07.174152 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.174128 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" containerName="console" Apr 22 21:13:07.206778 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.206751 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g"] Apr 22 21:13:07.206915 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.206906 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.209480 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.209444 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 21:13:07.209651 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.209625 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 21:13:07.209758 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.209708 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 21:13:07.210513 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.209924 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-pkzlb\"" Apr 22 21:13:07.210513 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.210144 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 21:13:07.210513 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.210364 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 21:13:07.215304 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.215277 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 21:13:07.315488 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.315445 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-serving-certs-ca-bundle\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.315660 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.315501 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-secret-telemeter-client\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.315660 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.315592 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.315660 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.315635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-federate-client-tls\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.315859 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.315665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjrzz\" (UniqueName: \"kubernetes.io/projected/16f44f55-57f0-4a60-b4a0-58b6921842e0-kube-api-access-kjrzz\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.315859 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.315693 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-metrics-client-ca\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.315859 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.315718 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.315859 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.315762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-telemeter-client-tls\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417043 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.417002 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-metrics-client-ca\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417205 ip-10-0-143-252 kubenswrapper[2568]: I0422 
21:13:07.417051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417205 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.417090 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-telemeter-client-tls\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417205 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.417161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-serving-certs-ca-bundle\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417381 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.417211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-secret-telemeter-client\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417381 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.417249 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417381 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.417285 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-federate-client-tls\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417381 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.417312 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjrzz\" (UniqueName: \"kubernetes.io/projected/16f44f55-57f0-4a60-b4a0-58b6921842e0-kube-api-access-kjrzz\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.417875 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.417835 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-metrics-client-ca\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.418001 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:13:07.417935 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-serving-certs-ca-bundle\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.418067 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.418026 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f44f55-57f0-4a60-b4a0-58b6921842e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.420141 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.420115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-telemeter-client-tls\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.420287 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.420250 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.420502 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.420482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-federate-client-tls\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.420658 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.420641 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/16f44f55-57f0-4a60-b4a0-58b6921842e0-secret-telemeter-client\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.424902 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.424879 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjrzz\" (UniqueName: \"kubernetes.io/projected/16f44f55-57f0-4a60-b4a0-58b6921842e0-kube-api-access-kjrzz\") pod \"telemeter-client-54bbfcf75c-kqd4g\" (UID: \"16f44f55-57f0-4a60-b4a0-58b6921842e0\") " pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.518832 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.518797 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" Apr 22 21:13:07.660065 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.660031 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g"] Apr 22 21:13:07.664241 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:13:07.664212 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f44f55_57f0_4a60_b4a0_58b6921842e0.slice/crio-00741cd66fe93f16e94f4b73e53c7b2ad91f727c7e9328204e97b6c750635bd4 WatchSource:0}: Error finding container 00741cd66fe93f16e94f4b73e53c7b2ad91f727c7e9328204e97b6c750635bd4: Status 404 returned error can't find the container with id 00741cd66fe93f16e94f4b73e53c7b2ad91f727c7e9328204e97b6c750635bd4 Apr 22 21:13:07.673924 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.673894 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" event={"ID":"16f44f55-57f0-4a60-b4a0-58b6921842e0","Type":"ContainerStarted","Data":"00741cd66fe93f16e94f4b73e53c7b2ad91f727c7e9328204e97b6c750635bd4"} Apr 22 21:13:07.679561 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.679532 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"229bff03-46de-4d1a-b214-0574275ea562","Type":"ContainerStarted","Data":"74143ff9b46fc01ace47bb10199b8f6d5ca7f233ff91563d927bf0c8aded8c0a"} Apr 22 21:13:07.679682 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.679568 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"229bff03-46de-4d1a-b214-0574275ea562","Type":"ContainerStarted","Data":"fd3611387d3fcb67864f4fbbb2a71e8cfa47b00c914cfcd9ba4893944af55b3f"} Apr 22 21:13:07.679682 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.679582 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"229bff03-46de-4d1a-b214-0574275ea562","Type":"ContainerStarted","Data":"eac9c9407b9c1d7f8a9fb47f9aa1affa944c21fcd5101059302cb6ade40022bc"} Apr 22 21:13:07.679682 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.679593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"229bff03-46de-4d1a-b214-0574275ea562","Type":"ContainerStarted","Data":"5e8082e1559cac55d94ce14cfcf52bff0295f61b7f2be5cc92d4c44f00d42d3a"} Apr 22 21:13:07.679682 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.679601 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"229bff03-46de-4d1a-b214-0574275ea562","Type":"ContainerStarted","Data":"4709beacd0c15de843bed0de8db094aad5097e83188dc45c8c47098e4f04460d"} Apr 22 21:13:07.679682 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.679609 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"229bff03-46de-4d1a-b214-0574275ea562","Type":"ContainerStarted","Data":"43505ebe8a96cdfa2cf8e653b8bdc3ba99c286131f6b4d459bb028139a75be44"} Apr 22 21:13:07.703659 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.703549 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.70353042 podStartE2EDuration="2.70353042s" podCreationTimestamp="2026-04-22 21:13:05 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:13:07.703166575 +0000 UTC m=+234.447215273" watchObservedRunningTime="2026-04-22 21:13:07.70353042 +0000 UTC m=+234.447579122" Apr 22 21:13:07.839969 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:07.839879 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7827c1f6-13ab-4f58-8ccb-4e9942c15f4a" path="/var/lib/kubelet/pods/7827c1f6-13ab-4f58-8ccb-4e9942c15f4a/volumes" Apr 22 21:13:09.692055 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:09.692021 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" event={"ID":"16f44f55-57f0-4a60-b4a0-58b6921842e0","Type":"ContainerStarted","Data":"a7e41b18eadc47da558c4767885fff034d674b0f598ef1df5252e5225833fb56"} Apr 22 21:13:09.692055 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:09.692060 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" event={"ID":"16f44f55-57f0-4a60-b4a0-58b6921842e0","Type":"ContainerStarted","Data":"d0800f08f9c99eb94492d5bdfdb3cbb4f25679c34c19a108d9f7bf268c44ca95"} Apr 22 21:13:09.692532 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:09.692070 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" event={"ID":"16f44f55-57f0-4a60-b4a0-58b6921842e0","Type":"ContainerStarted","Data":"852523ac3a6c45c1893b8033d45a9e5f61f8e33d03f7a5cf38e3847383e0a70e"} Apr 22 21:13:09.718035 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:09.717985 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-54bbfcf75c-kqd4g" podStartSLOduration=0.898748575 podStartE2EDuration="2.717971815s" podCreationTimestamp="2026-04-22 21:13:07 +0000 UTC" firstStartedPulling="2026-04-22 21:13:07.666182745 +0000 UTC m=+234.410231427" lastFinishedPulling="2026-04-22 21:13:09.485405972 +0000 UTC m=+236.229454667" observedRunningTime="2026-04-22 21:13:09.717683365 +0000 UTC m=+236.461732076" watchObservedRunningTime="2026-04-22 21:13:09.717971815 +0000 UTC m=+236.462020513" Apr 22 21:13:10.769640 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.769603 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-cdd5b779d-sqbnw"] Apr 22 21:13:10.773345 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.773317 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.784294 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.784266 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cdd5b779d-sqbnw"] Apr 22 21:13:10.850791 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.850749 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-trusted-ca-bundle\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.850791 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.850793 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-oauth-serving-cert\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.850997 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.850824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-serving-cert\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.850997 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.850861 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-oauth-config\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.850997 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.850879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-config\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.850997 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.850955 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-service-ca\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.851119 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.850998 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8mql\" (UniqueName: \"kubernetes.io/projected/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-kube-api-access-k8mql\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.952411 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.952346 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-serving-cert\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.952631 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.952465 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-oauth-config\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.952631 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.952493 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-config\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.952631 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.952547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-service-ca\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.952631 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.952575 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8mql\" (UniqueName: \"kubernetes.io/projected/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-kube-api-access-k8mql\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.952842 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.952674 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-trusted-ca-bundle\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.952842 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.952702 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-oauth-serving-cert\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.953418 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.953374 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-oauth-serving-cert\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.953511 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.953385 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-service-ca\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.953511 ip-10-0-143-252 kubenswrapper[2568]: I0422 
21:13:10.953416 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-config\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.953595 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.953548 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-trusted-ca-bundle\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.955567 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.955537 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-oauth-config\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.955649 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.955549 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-serving-cert\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:10.961635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:10.961615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8mql\" (UniqueName: \"kubernetes.io/projected/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-kube-api-access-k8mql\") pod \"console-cdd5b779d-sqbnw\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:11.084895 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:11.084797 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:11.212579 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:11.212546 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-cdd5b779d-sqbnw"] Apr 22 21:13:11.216256 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:13:11.216228 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f52de9_0b0f_4870_a2d8_42ae69b96e3d.slice/crio-ab40eca6cfcf4ab8b5ba052773c0aa87e7cb65f371a3fbe1200373129cdc916b WatchSource:0}: Error finding container ab40eca6cfcf4ab8b5ba052773c0aa87e7cb65f371a3fbe1200373129cdc916b: Status 404 returned error can't find the container with id ab40eca6cfcf4ab8b5ba052773c0aa87e7cb65f371a3fbe1200373129cdc916b Apr 22 21:13:11.700885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:11.700851 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdd5b779d-sqbnw" event={"ID":"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d","Type":"ContainerStarted","Data":"d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6"} Apr 22 21:13:11.700885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:11.700888 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdd5b779d-sqbnw" event={"ID":"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d","Type":"ContainerStarted","Data":"ab40eca6cfcf4ab8b5ba052773c0aa87e7cb65f371a3fbe1200373129cdc916b"} Apr 22 21:13:21.085941 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:21.085901 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:21.086369 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:21.085974 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:21.090645 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:21.090623 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:21.107807 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:21.107758 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-cdd5b779d-sqbnw" podStartSLOduration=11.107743284 podStartE2EDuration="11.107743284s" podCreationTimestamp="2026-04-22 21:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:13:11.717325786 +0000 UTC m=+238.461374486" watchObservedRunningTime="2026-04-22 21:13:21.107743284 +0000 UTC m=+247.851791982" Apr 22 21:13:21.736727 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:21.736698 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:13:21.776370 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:21.776336 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d9c597cdd-6kmd2"] Apr 22 21:13:25.594772 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:25.594733 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:13:25.597174 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:13:25.597135 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f-metrics-certs\") pod \"network-metrics-daemon-hptqt\" (UID: \"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f\") " pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:13:25.840814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:25.840774 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bw4nx\"" Apr 22 21:13:25.847204 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:25.847116 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hptqt" Apr 22 21:13:25.969839 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:25.969813 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hptqt"] Apr 22 21:13:25.972356 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:13:25.972326 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod605a4e19_b663_46e8_9fcc_e4bd8f2e9c4f.slice/crio-fec76a2af91bb636717c13701c19974d716f04e0216f5e515968f7196821e787 WatchSource:0}: Error finding container fec76a2af91bb636717c13701c19974d716f04e0216f5e515968f7196821e787: Status 404 returned error can't find the container with id fec76a2af91bb636717c13701c19974d716f04e0216f5e515968f7196821e787 Apr 22 21:13:26.750145 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:26.750096 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hptqt" event={"ID":"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f","Type":"ContainerStarted","Data":"fec76a2af91bb636717c13701c19974d716f04e0216f5e515968f7196821e787"} Apr 22 21:13:27.755745 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:27.755710 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hptqt" event={"ID":"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f","Type":"ContainerStarted","Data":"0bbd785895d17508c22b81fb04dd93468356e7f731ec454329362d0d44d89b3d"} Apr 22 21:13:27.755745 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:27.755751 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hptqt" event={"ID":"605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f","Type":"ContainerStarted","Data":"c24e8f9786dc819e305d5045e8de5f2acef99bc58f356767c99954f7613f1a1d"} Apr 22 21:13:27.771015 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:27.770957 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hptqt" podStartSLOduration=253.844520338 podStartE2EDuration="4m14.770940676s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:13:25.974237945 +0000 UTC m=+252.718286621" lastFinishedPulling="2026-04-22 21:13:26.900658271 +0000 UTC m=+253.644706959" observedRunningTime="2026-04-22 21:13:27.769051881 +0000 UTC m=+254.513100591" watchObservedRunningTime="2026-04-22 21:13:27.770940676 +0000 UTC m=+254.514989416" Apr 22 21:13:46.801887 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:46.801825 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-d9c597cdd-6kmd2" podUID="c75f4221-900c-41b9-a2d4-08ffbe740edc" containerName="console" 
containerID="cri-o://797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318" gracePeriod=15 Apr 22 21:13:47.048422 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.048379 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d9c597cdd-6kmd2_c75f4221-900c-41b9-a2d4-08ffbe740edc/console/0.log" Apr 22 21:13:47.048564 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.048455 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:13:47.081958 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.081869 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-config\") pod \"c75f4221-900c-41b9-a2d4-08ffbe740edc\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " Apr 22 21:13:47.081958 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.081915 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-oauth-serving-cert\") pod \"c75f4221-900c-41b9-a2d4-08ffbe740edc\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " Apr 22 21:13:47.081958 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.081943 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-oauth-config\") pod \"c75f4221-900c-41b9-a2d4-08ffbe740edc\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " Apr 22 21:13:47.082216 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.081975 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrr6r\" (UniqueName: \"kubernetes.io/projected/c75f4221-900c-41b9-a2d4-08ffbe740edc-kube-api-access-hrr6r\") pod \"c75f4221-900c-41b9-a2d4-08ffbe740edc\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " Apr 22 21:13:47.082216 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.082022 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-service-ca\") pod \"c75f4221-900c-41b9-a2d4-08ffbe740edc\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " Apr 22 21:13:47.082216 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.082042 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-serving-cert\") pod \"c75f4221-900c-41b9-a2d4-08ffbe740edc\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " Apr 22 21:13:47.082469 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.082446 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-trusted-ca-bundle\") pod \"c75f4221-900c-41b9-a2d4-08ffbe740edc\" (UID: \"c75f4221-900c-41b9-a2d4-08ffbe740edc\") " Apr 22 21:13:47.082553 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.082479 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-config" (OuterVolumeSpecName: "console-config") pod "c75f4221-900c-41b9-a2d4-08ffbe740edc" (UID: 
"c75f4221-900c-41b9-a2d4-08ffbe740edc"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:13:47.082642 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.082617 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-service-ca" (OuterVolumeSpecName: "service-ca") pod "c75f4221-900c-41b9-a2d4-08ffbe740edc" (UID: "c75f4221-900c-41b9-a2d4-08ffbe740edc"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:13:47.082911 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.082887 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c75f4221-900c-41b9-a2d4-08ffbe740edc" (UID: "c75f4221-900c-41b9-a2d4-08ffbe740edc"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:13:47.082961 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.082925 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c75f4221-900c-41b9-a2d4-08ffbe740edc" (UID: "c75f4221-900c-41b9-a2d4-08ffbe740edc"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:13:47.083087 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.083071 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-service-ca\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:47.083131 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.083096 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-trusted-ca-bundle\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:47.083131 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.083114 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:47.083199 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.083128 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c75f4221-900c-41b9-a2d4-08ffbe740edc-oauth-serving-cert\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:47.084264 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.084231 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c75f4221-900c-41b9-a2d4-08ffbe740edc" (UID: "c75f4221-900c-41b9-a2d4-08ffbe740edc"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:13:47.084264 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.084248 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c75f4221-900c-41b9-a2d4-08ffbe740edc" (UID: "c75f4221-900c-41b9-a2d4-08ffbe740edc"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:13:47.084597 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.084574 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c75f4221-900c-41b9-a2d4-08ffbe740edc-kube-api-access-hrr6r" (OuterVolumeSpecName: "kube-api-access-hrr6r") pod "c75f4221-900c-41b9-a2d4-08ffbe740edc" (UID: "c75f4221-900c-41b9-a2d4-08ffbe740edc"). InnerVolumeSpecName "kube-api-access-hrr6r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:13:47.184534 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.184478 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-serving-cert\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:47.184534 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.184525 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c75f4221-900c-41b9-a2d4-08ffbe740edc-console-oauth-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:47.184534 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.184539 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hrr6r\" (UniqueName: \"kubernetes.io/projected/c75f4221-900c-41b9-a2d4-08ffbe740edc-kube-api-access-hrr6r\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:13:47.821177 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.821140 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d9c597cdd-6kmd2_c75f4221-900c-41b9-a2d4-08ffbe740edc/console/0.log" Apr 22 21:13:47.821606 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.821191 2568 generic.go:358] "Generic (PLEG): container finished" podID="c75f4221-900c-41b9-a2d4-08ffbe740edc" containerID="797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318" exitCode=2 Apr 22 21:13:47.821606 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.821225 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9c597cdd-6kmd2" event={"ID":"c75f4221-900c-41b9-a2d4-08ffbe740edc","Type":"ContainerDied","Data":"797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318"} Apr 22 21:13:47.821606 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.821268 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d9c597cdd-6kmd2" event={"ID":"c75f4221-900c-41b9-a2d4-08ffbe740edc","Type":"ContainerDied","Data":"b86d1ebfc12c1033a25a19e44113e0f939addb93f297ede8e2c11333a44e4f63"} Apr 22 21:13:47.821606 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.821281 2568 scope.go:117] "RemoveContainer" containerID="797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318" Apr 22 21:13:47.821606 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.821295 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d9c597cdd-6kmd2" Apr 22 21:13:47.829567 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.829546 2568 scope.go:117] "RemoveContainer" containerID="797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318" Apr 22 21:13:47.829880 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:13:47.829856 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318\": container with ID starting with 797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318 not found: ID does not exist" containerID="797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318" Apr 22 21:13:47.829944 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.829887 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318"} err="failed to get container status \"797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318\": rpc error: code = NotFound desc = could not find container \"797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318\": container with ID starting with 797fe9ef658769bd7f757b242dcae97986b8e3fb4e6a7c1a0d73039ca2dad318 not found: ID does not exist" Apr 22 21:13:47.841788 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.841750 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-d9c597cdd-6kmd2"] Apr 22 21:13:47.847265 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:47.847236 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-d9c597cdd-6kmd2"] Apr 22 21:13:49.838979 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:13:49.838946 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c75f4221-900c-41b9-a2d4-08ffbe740edc" path="/var/lib/kubelet/pods/c75f4221-900c-41b9-a2d4-08ffbe740edc/volumes" Apr 22 21:14:13.719494 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:14:13.719464 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log" Apr 22 21:14:13.720034 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:14:13.719468 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log" Apr 22 21:14:13.723000 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:14:13.722979 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:14:13.723128 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:14:13.722979 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:14:13.729770 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:14:13.729752 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 21:15:06.676748 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.676670 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68c7f7cc97-b79qb"] Apr 22 21:15:06.677273 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.677055 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c75f4221-900c-41b9-a2d4-08ffbe740edc" containerName="console" Apr 22 21:15:06.677273 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.677069 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c75f4221-900c-41b9-a2d4-08ffbe740edc" containerName="console" Apr 22 21:15:06.677273 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.677144 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c75f4221-900c-41b9-a2d4-08ffbe740edc" containerName="console" Apr 22 21:15:06.679234 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.679179 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.689727 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.689700 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68c7f7cc97-b79qb"] Apr 22 21:15:06.760128 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.760083 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-oauth-serving-cert\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.760309 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.760149 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpv2z\" (UniqueName: \"kubernetes.io/projected/adbf1819-6d53-4d45-8afb-fcbb876ef229-kube-api-access-fpv2z\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.760309 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.760184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-oauth-config\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.760309 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.760207 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-service-ca\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.760449 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.760305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-config\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.760449 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.760337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-trusted-ca-bundle\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.760449 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.760374 
2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-serving-cert\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.861345 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.861308 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-oauth-serving-cert\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.861554 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.861362 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpv2z\" (UniqueName: \"kubernetes.io/projected/adbf1819-6d53-4d45-8afb-fcbb876ef229-kube-api-access-fpv2z\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.861554 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.861411 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-oauth-config\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.861677 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.861580 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-service-ca\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.861677 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.861647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-config\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.861677 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.861672 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-trusted-ca-bundle\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.861826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.861717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-serving-cert\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.862227 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.862202 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-oauth-serving-cert\") pod 
\"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.862357 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.862229 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-service-ca\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.862440 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.862374 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-config\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.862529 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.862505 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-trusted-ca-bundle\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.864071 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.864051 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-oauth-config\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.864255 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.864235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-serving-cert\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.869235 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.869205 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpv2z\" (UniqueName: \"kubernetes.io/projected/adbf1819-6d53-4d45-8afb-fcbb876ef229-kube-api-access-fpv2z\") pod \"console-68c7f7cc97-b79qb\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:06.988713 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:06.988630 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:07.111807 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:07.111778 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68c7f7cc97-b79qb"] Apr 22 21:15:07.114456 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:15:07.114372 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbf1819_6d53_4d45_8afb_fcbb876ef229.slice/crio-938b709fd3146b52eea9d0e0dd8bb209f2228dd831d53c02563d674d2d1397ce WatchSource:0}: Error finding container 938b709fd3146b52eea9d0e0dd8bb209f2228dd831d53c02563d674d2d1397ce: Status 404 returned error can't find the container with id 938b709fd3146b52eea9d0e0dd8bb209f2228dd831d53c02563d674d2d1397ce Apr 22 21:15:07.116127 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:07.116110 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:15:08.074133 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:08.074096 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68c7f7cc97-b79qb" event={"ID":"adbf1819-6d53-4d45-8afb-fcbb876ef229","Type":"ContainerStarted","Data":"3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61"} Apr 22 21:15:08.074133 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:08.074134 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68c7f7cc97-b79qb" event={"ID":"adbf1819-6d53-4d45-8afb-fcbb876ef229","Type":"ContainerStarted","Data":"938b709fd3146b52eea9d0e0dd8bb209f2228dd831d53c02563d674d2d1397ce"} Apr 22 21:15:08.089985 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:08.089932 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68c7f7cc97-b79qb" podStartSLOduration=2.089918435 podStartE2EDuration="2.089918435s" podCreationTimestamp="2026-04-22 21:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:15:08.088166021 +0000 UTC m=+354.832214732" watchObservedRunningTime="2026-04-22 21:15:08.089918435 +0000 UTC m=+354.833967134" Apr 22 21:15:16.989100 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:16.989065 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:16.989494 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:16.989130 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:16.993799 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:16.993775 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:17.106343 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:17.106317 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:15:17.140466 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:17.140426 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cdd5b779d-sqbnw"] Apr 22 21:15:42.161156 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.161108 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-cdd5b779d-sqbnw" podUID="f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" containerName="console" 
containerID="cri-o://d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6" gracePeriod=15 Apr 22 21:15:42.405309 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.405286 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cdd5b779d-sqbnw_f0f52de9-0b0f-4870-a2d8-42ae69b96e3d/console/0.log" Apr 22 21:15:42.405475 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.405347 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:15:42.485126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485034 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-oauth-serving-cert\") pod \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " Apr 22 21:15:42.485126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485092 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-oauth-config\") pod \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " Apr 22 21:15:42.485126 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485121 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-config\") pod \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " Apr 22 21:15:42.485446 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485145 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8mql\" (UniqueName: \"kubernetes.io/projected/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-kube-api-access-k8mql\") pod \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " Apr 22 21:15:42.485446 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485193 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-trusted-ca-bundle\") pod \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " Apr 22 21:15:42.485446 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485219 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-serving-cert\") pod \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " Apr 22 21:15:42.485446 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485245 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-service-ca\") pod \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\" (UID: \"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d\") " Apr 22 21:15:42.485668 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485591 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" (UID: 
"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:15:42.485726 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485700 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" (UID: "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:15:42.485778 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.485727 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-service-ca" (OuterVolumeSpecName: "service-ca") pod "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" (UID: "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:15:42.486321 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.486298 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-config" (OuterVolumeSpecName: "console-config") pod "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" (UID: "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:15:42.487547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.487526 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-kube-api-access-k8mql" (OuterVolumeSpecName: "kube-api-access-k8mql") pod "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" (UID: "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d"). InnerVolumeSpecName "kube-api-access-k8mql". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:15:42.487808 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.487776 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" (UID: "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:15:42.487808 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.487789 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" (UID: "f0f52de9-0b0f-4870-a2d8-42ae69b96e3d"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:15:42.586453 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.586422 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-trusted-ca-bundle\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:15:42.586453 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.586450 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-serving-cert\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:15:42.586453 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.586461 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-service-ca\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:15:42.586708 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.586471 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-oauth-serving-cert\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:15:42.586708 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.586480 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-oauth-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:15:42.586708 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.586491 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-console-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:15:42.586708 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:42.586500 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8mql\" (UniqueName: \"kubernetes.io/projected/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d-kube-api-access-k8mql\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:15:43.180429 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.180403 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-cdd5b779d-sqbnw_f0f52de9-0b0f-4870-a2d8-42ae69b96e3d/console/0.log" Apr 22 21:15:43.180814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.180446 2568 generic.go:358] "Generic (PLEG): container finished" podID="f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" containerID="d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6" exitCode=2 Apr 22 21:15:43.180814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.180516 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-cdd5b779d-sqbnw" Apr 22 21:15:43.180814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.180535 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdd5b779d-sqbnw" event={"ID":"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d","Type":"ContainerDied","Data":"d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6"} Apr 22 21:15:43.180814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.180574 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-cdd5b779d-sqbnw" event={"ID":"f0f52de9-0b0f-4870-a2d8-42ae69b96e3d","Type":"ContainerDied","Data":"ab40eca6cfcf4ab8b5ba052773c0aa87e7cb65f371a3fbe1200373129cdc916b"} Apr 22 21:15:43.180814 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.180588 2568 scope.go:117] "RemoveContainer" containerID="d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6" Apr 22 21:15:43.189413 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.189376 2568 scope.go:117] "RemoveContainer" containerID="d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6" Apr 22 21:15:43.189671 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:15:43.189648 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6\": container with ID starting with d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6 not found: ID does not exist" containerID="d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6" Apr 22 21:15:43.189729 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.189679 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6"} err="failed to get container status \"d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6\": rpc error: code = NotFound desc = could not find container \"d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6\": container with ID starting with d9f784c5f0fbb7148068ed4db11422fe05817c622528e00a4bdb564d0a751ba6 not found: ID does not exist" Apr 22 21:15:43.200925 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.200892 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-cdd5b779d-sqbnw"] Apr 22 21:15:43.204462 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.204435 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-cdd5b779d-sqbnw"] Apr 22 21:15:43.839864 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:43.839831 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" path="/var/lib/kubelet/pods/f0f52de9-0b0f-4870-a2d8-42ae69b96e3d/volumes" Apr 22 21:15:47.691498 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.691461 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv"] Apr 22 21:15:47.691901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.691886 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" containerName="console" Apr 22 21:15:47.691946 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.691904 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" containerName="console" Apr 22 21:15:47.691980 
ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.691965 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0f52de9-0b0f-4870-a2d8-42ae69b96e3d" containerName="console" Apr 22 21:15:47.696208 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.696191 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" Apr 22 21:15:47.698702 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.698679 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 21:15:47.698827 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.698715 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:15:47.698827 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.698724 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-fd85b\"" Apr 22 21:15:47.706314 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.706290 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv"] Apr 22 21:15:47.735848 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.735808 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-9wljv\" (UID: \"c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" Apr 22 21:15:47.736033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.735867 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp98n\" (UniqueName: \"kubernetes.io/projected/c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7-kube-api-access-jp98n\") pod \"cert-manager-operator-controller-manager-54b9655956-9wljv\" (UID: \"c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" Apr 22 21:15:47.836871 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.836831 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp98n\" (UniqueName: \"kubernetes.io/projected/c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7-kube-api-access-jp98n\") pod \"cert-manager-operator-controller-manager-54b9655956-9wljv\" (UID: \"c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" Apr 22 21:15:47.837054 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.836963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-9wljv\" (UID: \"c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" Apr 22 21:15:47.837384 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.837363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7-tmp\") pod 
\"cert-manager-operator-controller-manager-54b9655956-9wljv\" (UID: \"c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" Apr 22 21:15:47.849716 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:47.849689 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp98n\" (UniqueName: \"kubernetes.io/projected/c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7-kube-api-access-jp98n\") pod \"cert-manager-operator-controller-manager-54b9655956-9wljv\" (UID: \"c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" Apr 22 21:15:48.006183 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:48.006101 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" Apr 22 21:15:48.145118 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:48.145088 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv"] Apr 22 21:15:48.147904 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:15:48.147879 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc04ae6b5_5b2b_4d6f_b044_dfbcbfcfcab7.slice/crio-dcf320904e69a4723070df7e5b7120553d1c297e220b5f496aef241c77381d9e WatchSource:0}: Error finding container dcf320904e69a4723070df7e5b7120553d1c297e220b5f496aef241c77381d9e: Status 404 returned error can't find the container with id dcf320904e69a4723070df7e5b7120553d1c297e220b5f496aef241c77381d9e Apr 22 21:15:48.203376 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:48.203344 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" event={"ID":"c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7","Type":"ContainerStarted","Data":"dcf320904e69a4723070df7e5b7120553d1c297e220b5f496aef241c77381d9e"} Apr 22 21:15:51.222543 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:51.222496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" event={"ID":"c04ae6b5-5b2b-4d6f-b044-dfbcbfcfcab7","Type":"ContainerStarted","Data":"a3402a042c99c4591e2faa1fa8f7b483307944d608b2cd0f97baa1bba233cc23"} Apr 22 21:15:51.242242 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:51.241701 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-9wljv" podStartSLOduration=1.253456744 podStartE2EDuration="4.24168194s" podCreationTimestamp="2026-04-22 21:15:47 +0000 UTC" firstStartedPulling="2026-04-22 21:15:48.15050957 +0000 UTC m=+394.894558250" lastFinishedPulling="2026-04-22 21:15:51.138734756 +0000 UTC m=+397.882783446" observedRunningTime="2026-04-22 21:15:51.239622831 +0000 UTC m=+397.983671586" watchObservedRunningTime="2026-04-22 21:15:51.24168194 +0000 UTC m=+397.985730642" Apr 22 21:15:53.665803 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.665765 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-hx6w8"] Apr 22 21:15:53.669197 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.669178 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:53.671497 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.671477 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 21:15:53.672448 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.672430 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 21:15:53.672506 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.672448 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-8x7sk\"" Apr 22 21:15:53.677799 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.677764 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-hx6w8"] Apr 22 21:15:53.788520 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.788483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11ba074d-6523-4ff2-b8d3-a1685d394fdf-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-hx6w8\" (UID: \"11ba074d-6523-4ff2-b8d3-a1685d394fdf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:53.788672 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.788626 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgw4s\" (UniqueName: \"kubernetes.io/projected/11ba074d-6523-4ff2-b8d3-a1685d394fdf-kube-api-access-dgw4s\") pod \"cert-manager-webhook-587ccfb98-hx6w8\" (UID: \"11ba074d-6523-4ff2-b8d3-a1685d394fdf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:53.889593 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.889559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgw4s\" (UniqueName: \"kubernetes.io/projected/11ba074d-6523-4ff2-b8d3-a1685d394fdf-kube-api-access-dgw4s\") pod \"cert-manager-webhook-587ccfb98-hx6w8\" (UID: \"11ba074d-6523-4ff2-b8d3-a1685d394fdf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:53.889864 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.889842 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11ba074d-6523-4ff2-b8d3-a1685d394fdf-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-hx6w8\" (UID: \"11ba074d-6523-4ff2-b8d3-a1685d394fdf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:53.897072 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.897047 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11ba074d-6523-4ff2-b8d3-a1685d394fdf-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-hx6w8\" (UID: \"11ba074d-6523-4ff2-b8d3-a1685d394fdf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:53.897222 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.897192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgw4s\" (UniqueName: \"kubernetes.io/projected/11ba074d-6523-4ff2-b8d3-a1685d394fdf-kube-api-access-dgw4s\") pod \"cert-manager-webhook-587ccfb98-hx6w8\" (UID: \"11ba074d-6523-4ff2-b8d3-a1685d394fdf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:53.987881 
ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:53.987805 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:54.109673 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:54.109646 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-hx6w8"] Apr 22 21:15:54.112351 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:15:54.112319 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11ba074d_6523_4ff2_b8d3_a1685d394fdf.slice/crio-66f2bd32dd82320ea9162015615db4eb589e531b09ddf4120975ae5256549d64 WatchSource:0}: Error finding container 66f2bd32dd82320ea9162015615db4eb589e531b09ddf4120975ae5256549d64: Status 404 returned error can't find the container with id 66f2bd32dd82320ea9162015615db4eb589e531b09ddf4120975ae5256549d64 Apr 22 21:15:54.233453 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:54.233414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" event={"ID":"11ba074d-6523-4ff2-b8d3-a1685d394fdf","Type":"ContainerStarted","Data":"66f2bd32dd82320ea9162015615db4eb589e531b09ddf4120975ae5256549d64"} Apr 22 21:15:59.253068 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:59.253023 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" event={"ID":"11ba074d-6523-4ff2-b8d3-a1685d394fdf","Type":"ContainerStarted","Data":"9916dbd084e335b0eb524ec000cc863e73749b5086e7fbb4d25251ab9f8fb75d"} Apr 22 21:15:59.253557 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:59.253109 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:15:59.272420 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:15:59.272338 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" podStartSLOduration=1.330591864 podStartE2EDuration="6.272322818s" podCreationTimestamp="2026-04-22 21:15:53 +0000 UTC" firstStartedPulling="2026-04-22 21:15:54.114269466 +0000 UTC m=+400.858318144" lastFinishedPulling="2026-04-22 21:15:59.056000417 +0000 UTC m=+405.800049098" observedRunningTime="2026-04-22 21:15:59.269121555 +0000 UTC m=+406.013170258" watchObservedRunningTime="2026-04-22 21:15:59.272322818 +0000 UTC m=+406.016371517" Apr 22 21:16:02.621092 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.621054 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw"] Apr 22 21:16:02.624612 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.624597 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" Apr 22 21:16:02.627032 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.627011 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:16:02.628016 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.627999 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-xb6hx\"" Apr 22 21:16:02.628073 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.628032 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 21:16:02.633051 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.633028 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw"] Apr 22 21:16:02.769971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.769934 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9rq\" (UniqueName: \"kubernetes.io/projected/7925b8e2-1630-4189-9f93-745459937e16-kube-api-access-7r9rq\") pod \"openshift-lws-operator-bfc7f696d-cmtnw\" (UID: \"7925b8e2-1630-4189-9f93-745459937e16\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" Apr 22 21:16:02.770143 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.769989 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7925b8e2-1630-4189-9f93-745459937e16-tmp\") pod \"openshift-lws-operator-bfc7f696d-cmtnw\" (UID: \"7925b8e2-1630-4189-9f93-745459937e16\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" Apr 22 21:16:02.870634 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.870578 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9rq\" (UniqueName: \"kubernetes.io/projected/7925b8e2-1630-4189-9f93-745459937e16-kube-api-access-7r9rq\") pod \"openshift-lws-operator-bfc7f696d-cmtnw\" (UID: \"7925b8e2-1630-4189-9f93-745459937e16\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" Apr 22 21:16:02.870824 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.870655 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7925b8e2-1630-4189-9f93-745459937e16-tmp\") pod \"openshift-lws-operator-bfc7f696d-cmtnw\" (UID: \"7925b8e2-1630-4189-9f93-745459937e16\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" Apr 22 21:16:02.871012 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.870995 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7925b8e2-1630-4189-9f93-745459937e16-tmp\") pod \"openshift-lws-operator-bfc7f696d-cmtnw\" (UID: \"7925b8e2-1630-4189-9f93-745459937e16\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" Apr 22 21:16:02.878082 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.878009 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9rq\" (UniqueName: \"kubernetes.io/projected/7925b8e2-1630-4189-9f93-745459937e16-kube-api-access-7r9rq\") pod \"openshift-lws-operator-bfc7f696d-cmtnw\" (UID: \"7925b8e2-1630-4189-9f93-745459937e16\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" Apr 22 21:16:02.934766 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:02.934724 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" Apr 22 21:16:03.061159 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:03.061126 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw"] Apr 22 21:16:03.063814 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:16:03.063779 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7925b8e2_1630_4189_9f93_745459937e16.slice/crio-987073d4378b61f20ee7dd14afb3051c298a8947957c5b592c2fe51d18bbb71b WatchSource:0}: Error finding container 987073d4378b61f20ee7dd14afb3051c298a8947957c5b592c2fe51d18bbb71b: Status 404 returned error can't find the container with id 987073d4378b61f20ee7dd14afb3051c298a8947957c5b592c2fe51d18bbb71b Apr 22 21:16:03.267571 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:03.267490 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" event={"ID":"7925b8e2-1630-4189-9f93-745459937e16","Type":"ContainerStarted","Data":"987073d4378b61f20ee7dd14afb3051c298a8947957c5b592c2fe51d18bbb71b"} Apr 22 21:16:05.258648 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:05.258616 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-hx6w8" Apr 22 21:16:06.281113 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:06.281024 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" event={"ID":"7925b8e2-1630-4189-9f93-745459937e16","Type":"ContainerStarted","Data":"d2c9e532f22cb9077206f20a58c8db60604eda93757d8eadc78d6e69d4a8925d"} Apr 22 21:16:06.297056 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:06.297002 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-cmtnw" podStartSLOduration=1.493663776 podStartE2EDuration="4.296985673s" podCreationTimestamp="2026-04-22 21:16:02 +0000 UTC" firstStartedPulling="2026-04-22 21:16:03.065179366 +0000 UTC m=+409.809228043" lastFinishedPulling="2026-04-22 21:16:05.86850126 +0000 UTC m=+412.612549940" observedRunningTime="2026-04-22 21:16:06.29588875 +0000 UTC m=+413.039937470" watchObservedRunningTime="2026-04-22 21:16:06.296985673 +0000 UTC m=+413.041034371" Apr 22 21:16:26.748918 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.748880 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk"] Apr 22 21:16:26.760221 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.760194 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.763319 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.763293 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 21:16:26.763494 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.763474 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 21:16:26.763724 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.763707 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-s8gxl\"" Apr 22 21:16:26.763831 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.763815 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 21:16:26.763886 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.763875 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 21:16:26.769471 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.769447 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk"] Apr 22 21:16:26.776766 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.776738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/948cd868-138f-4d97-8db2-f99ecb3f2f0b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.776902 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.776770 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/948cd868-138f-4d97-8db2-f99ecb3f2f0b-webhook-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.776902 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.776798 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99x25\" (UniqueName: \"kubernetes.io/projected/948cd868-138f-4d97-8db2-f99ecb3f2f0b-kube-api-access-99x25\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.877640 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.877595 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/948cd868-138f-4d97-8db2-f99ecb3f2f0b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.877860 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.877648 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/948cd868-138f-4d97-8db2-f99ecb3f2f0b-webhook-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.877860 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.877790 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99x25\" (UniqueName: \"kubernetes.io/projected/948cd868-138f-4d97-8db2-f99ecb3f2f0b-kube-api-access-99x25\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.880080 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.880050 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/948cd868-138f-4d97-8db2-f99ecb3f2f0b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.880185 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.880101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/948cd868-138f-4d97-8db2-f99ecb3f2f0b-webhook-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:26.885367 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:26.885344 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99x25\" (UniqueName: \"kubernetes.io/projected/948cd868-138f-4d97-8db2-f99ecb3f2f0b-kube-api-access-99x25\") pod \"opendatahub-operator-controller-manager-65d8664856-rdsxk\" (UID: \"948cd868-138f-4d97-8db2-f99ecb3f2f0b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:27.072312 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:27.072199 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:27.202192 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:27.202149 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk"] Apr 22 21:16:27.206070 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:16:27.206040 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod948cd868_138f_4d97_8db2_f99ecb3f2f0b.slice/crio-2ff9a9f91c22a042d0cf213261e49c51628de6887f1f7f6dffef94ef20ebbec6 WatchSource:0}: Error finding container 2ff9a9f91c22a042d0cf213261e49c51628de6887f1f7f6dffef94ef20ebbec6: Status 404 returned error can't find the container with id 2ff9a9f91c22a042d0cf213261e49c51628de6887f1f7f6dffef94ef20ebbec6 Apr 22 21:16:27.354103 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:27.354065 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" event={"ID":"948cd868-138f-4d97-8db2-f99ecb3f2f0b","Type":"ContainerStarted","Data":"2ff9a9f91c22a042d0cf213261e49c51628de6887f1f7f6dffef94ef20ebbec6"} Apr 22 21:16:30.368178 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:30.368142 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" event={"ID":"948cd868-138f-4d97-8db2-f99ecb3f2f0b","Type":"ContainerStarted","Data":"f9076acdb783f762fc930344cdbc722949b4890344126b1b03584d2323cd506c"} Apr 22 21:16:30.368579 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:30.368252 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:30.393566 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:30.393505 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" podStartSLOduration=1.752295303 podStartE2EDuration="4.393488531s" podCreationTimestamp="2026-04-22 21:16:26 +0000 UTC" firstStartedPulling="2026-04-22 21:16:27.208127402 +0000 UTC m=+433.952176079" lastFinishedPulling="2026-04-22 21:16:29.849320625 +0000 UTC m=+436.593369307" observedRunningTime="2026-04-22 21:16:30.390519152 +0000 UTC m=+437.134567854" watchObservedRunningTime="2026-04-22 21:16:30.393488531 +0000 UTC m=+437.137537231" Apr 22 21:16:33.002420 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.002368 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6"] Apr 22 21:16:33.005887 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.005858 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.008693 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.008670 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 21:16:33.009891 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.009867 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 21:16:33.010017 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.009930 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 21:16:33.010017 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.009873 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-r8sgt\"" Apr 22 21:16:33.012467 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.012444 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6"] Apr 22 21:16:33.037658 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.037617 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj7zc\" (UniqueName: \"kubernetes.io/projected/b84866af-bef1-4d0c-aa5c-ff1de43156ca-kube-api-access-zj7zc\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.037831 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.037685 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b84866af-bef1-4d0c-aa5c-ff1de43156ca-cert\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.037831 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.037711 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b84866af-bef1-4d0c-aa5c-ff1de43156ca-metrics-cert\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.037831 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.037745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b84866af-bef1-4d0c-aa5c-ff1de43156ca-manager-config\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.138145 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.138108 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b84866af-bef1-4d0c-aa5c-ff1de43156ca-cert\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.138145 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.138143 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b84866af-bef1-4d0c-aa5c-ff1de43156ca-metrics-cert\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.138353 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.138173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b84866af-bef1-4d0c-aa5c-ff1de43156ca-manager-config\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.138353 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.138233 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj7zc\" (UniqueName: \"kubernetes.io/projected/b84866af-bef1-4d0c-aa5c-ff1de43156ca-kube-api-access-zj7zc\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.138950 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.138914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b84866af-bef1-4d0c-aa5c-ff1de43156ca-manager-config\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.140679 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.140661 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b84866af-bef1-4d0c-aa5c-ff1de43156ca-metrics-cert\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.140772 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.140737 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b84866af-bef1-4d0c-aa5c-ff1de43156ca-cert\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.146073 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.146049 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj7zc\" (UniqueName: \"kubernetes.io/projected/b84866af-bef1-4d0c-aa5c-ff1de43156ca-kube-api-access-zj7zc\") pod \"lws-controller-manager-5db7bf5949-sqdz6\" (UID: \"b84866af-bef1-4d0c-aa5c-ff1de43156ca\") " pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.317288 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.317191 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:33.450241 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:33.450123 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6"] Apr 22 21:16:34.384853 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:34.384807 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" event={"ID":"b84866af-bef1-4d0c-aa5c-ff1de43156ca","Type":"ContainerStarted","Data":"d123119ba98a5d7f192309dd6e7ee3d1237a657efb56934366941e41e4a51d9b"} Apr 22 21:16:35.389891 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:35.389845 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" event={"ID":"b84866af-bef1-4d0c-aa5c-ff1de43156ca","Type":"ContainerStarted","Data":"779f50fabe0154e0cbbb0083f9c50f0cadd3af3bac31ad007fd58e9f3ae6890a"} Apr 22 21:16:35.389891 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:35.389894 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:35.406723 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:35.406668 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" podStartSLOduration=1.977205793 podStartE2EDuration="3.406652474s" podCreationTimestamp="2026-04-22 21:16:32 +0000 UTC" firstStartedPulling="2026-04-22 21:16:33.450595529 +0000 UTC m=+440.194644213" lastFinishedPulling="2026-04-22 21:16:34.880042217 +0000 UTC m=+441.624090894" observedRunningTime="2026-04-22 21:16:35.405523972 +0000 UTC m=+442.149572671" watchObservedRunningTime="2026-04-22 21:16:35.406652474 +0000 UTC m=+442.150701172" Apr 22 21:16:41.374301 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:41.374262 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-rdsxk" Apr 22 21:16:46.396127 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:46.396096 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5db7bf5949-sqdz6" Apr 22 21:16:55.474723 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.474678 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78"] Apr 22 21:16:55.478403 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.478361 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:55.480912 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.480885 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 21:16:55.480912 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.480897 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 21:16:55.480912 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.480894 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-l964r\"" Apr 22 21:16:55.481982 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.481961 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 21:16:55.481982 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.481974 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 21:16:55.487424 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.487375 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78"] Apr 22 21:16:55.649964 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.649926 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tmp\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:55.650147 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.649995 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fndjp\" (UniqueName: \"kubernetes.io/projected/a49e7b51-7394-4b9e-aac2-fc8c2586780f-kube-api-access-fndjp\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:55.650147 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.650025 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tls-certs\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:55.752232 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.751562 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fndjp\" (UniqueName: \"kubernetes.io/projected/a49e7b51-7394-4b9e-aac2-fc8c2586780f-kube-api-access-fndjp\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:55.752232 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.751629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tls-certs\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:55.752232 ip-10-0-143-252 kubenswrapper[2568]: 
I0422 21:16:55.751713 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tmp\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:55.753728 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:16:55.753700 2568 secret.go:189] Couldn't get secret openshift-ingress/kube-auth-proxy-tls: secret "kube-auth-proxy-tls" not found Apr 22 21:16:55.754009 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:16:55.753978 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tls-certs podName:a49e7b51-7394-4b9e-aac2-fc8c2586780f nodeName:}" failed. No retries permitted until 2026-04-22 21:16:56.253947316 +0000 UTC m=+462.997996016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tls-certs") pod "kube-auth-proxy-7b8c5f7f67-9jf78" (UID: "a49e7b51-7394-4b9e-aac2-fc8c2586780f") : secret "kube-auth-proxy-tls" not found Apr 22 21:16:55.754819 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.754797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tmp\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:55.764629 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:55.764603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fndjp\" (UniqueName: \"kubernetes.io/projected/a49e7b51-7394-4b9e-aac2-fc8c2586780f-kube-api-access-fndjp\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:56.255810 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:56.255760 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tls-certs\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:56.258313 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:56.258290 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a49e7b51-7394-4b9e-aac2-fc8c2586780f-tls-certs\") pod \"kube-auth-proxy-7b8c5f7f67-9jf78\" (UID: \"a49e7b51-7394-4b9e-aac2-fc8c2586780f\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:56.389325 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:56.389289 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" Apr 22 21:16:56.515057 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:56.515027 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78"] Apr 22 21:16:56.517769 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:16:56.517738 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda49e7b51_7394_4b9e_aac2_fc8c2586780f.slice/crio-d1b768530b08b0ee27020ca0e66ed3a4266b756f0cba8b5ea8357b49c51d992c WatchSource:0}: Error finding container d1b768530b08b0ee27020ca0e66ed3a4266b756f0cba8b5ea8357b49c51d992c: Status 404 returned error can't find the container with id d1b768530b08b0ee27020ca0e66ed3a4266b756f0cba8b5ea8357b49c51d992c Apr 22 21:16:57.478774 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:16:57.478731 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" event={"ID":"a49e7b51-7394-4b9e-aac2-fc8c2586780f","Type":"ContainerStarted","Data":"d1b768530b08b0ee27020ca0e66ed3a4266b756f0cba8b5ea8357b49c51d992c"} Apr 22 21:17:00.490949 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:17:00.490910 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" event={"ID":"a49e7b51-7394-4b9e-aac2-fc8c2586780f","Type":"ContainerStarted","Data":"ed303b0bfea8ceae31d528c9ca43c8717ef61eb6c70d0e12958d1f75b7f22227"} Apr 22 21:17:00.507905 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:17:00.507851 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-9jf78" podStartSLOduration=2.489162264 podStartE2EDuration="5.507835606s" podCreationTimestamp="2026-04-22 21:16:55 +0000 UTC" firstStartedPulling="2026-04-22 21:16:56.519486201 +0000 UTC m=+463.263534878" lastFinishedPulling="2026-04-22 21:16:59.538159532 +0000 UTC m=+466.282208220" observedRunningTime="2026-04-22 21:17:00.506451564 +0000 UTC m=+467.250500262" watchObservedRunningTime="2026-04-22 21:17:00.507835606 +0000 UTC m=+467.251884363" Apr 22 21:18:13.261735 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.261649 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd9b9bfcc-pnhzv"] Apr 22 21:18:13.264379 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.264350 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.279116 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.279081 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd9b9bfcc-pnhzv"] Apr 22 21:18:13.341973 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.341929 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-oauth-serving-cert\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.342146 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.341994 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxzft\" (UniqueName: \"kubernetes.io/projected/911b8efe-a3a8-4725-b945-cd5e0976f559-kube-api-access-zxzft\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.342146 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.342019 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/911b8efe-a3a8-4725-b945-cd5e0976f559-console-serving-cert\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.342146 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.342122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/911b8efe-a3a8-4725-b945-cd5e0976f559-console-oauth-config\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.342253 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.342151 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-trusted-ca-bundle\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.342253 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.342177 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-service-ca\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.342315 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.342269 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-console-config\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.443108 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.443068 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-console-config\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.443306 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.443120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-oauth-serving-cert\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.443306 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.443155 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxzft\" (UniqueName: \"kubernetes.io/projected/911b8efe-a3a8-4725-b945-cd5e0976f559-kube-api-access-zxzft\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.443306 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.443174 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/911b8efe-a3a8-4725-b945-cd5e0976f559-console-serving-cert\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.443306 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.443229 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/911b8efe-a3a8-4725-b945-cd5e0976f559-console-oauth-config\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.443569 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.443357 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-trusted-ca-bundle\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.443569 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.443449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-service-ca\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.443935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.443912 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-console-config\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.444150 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.444124 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-oauth-serving-cert\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.444192 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:18:13.444162 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-service-ca\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.444229 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.444207 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/911b8efe-a3a8-4725-b945-cd5e0976f559-trusted-ca-bundle\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.445816 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.445788 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/911b8efe-a3a8-4725-b945-cd5e0976f559-console-oauth-config\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.445978 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.445868 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/911b8efe-a3a8-4725-b945-cd5e0976f559-console-serving-cert\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.454000 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.453974 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxzft\" (UniqueName: \"kubernetes.io/projected/911b8efe-a3a8-4725-b945-cd5e0976f559-kube-api-access-zxzft\") pod \"console-7cd9b9bfcc-pnhzv\" (UID: \"911b8efe-a3a8-4725-b945-cd5e0976f559\") " pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.574473 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.574327 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:13.707311 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.707287 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd9b9bfcc-pnhzv"] Apr 22 21:18:13.710331 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:18:13.710303 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911b8efe_a3a8_4725_b945_cd5e0976f559.slice/crio-d13409cdf98435db123d53664e0e89a3ad49e1283d35d8816b8e8c8d51c0e373 WatchSource:0}: Error finding container d13409cdf98435db123d53664e0e89a3ad49e1283d35d8816b8e8c8d51c0e373: Status 404 returned error can't find the container with id d13409cdf98435db123d53664e0e89a3ad49e1283d35d8816b8e8c8d51c0e373 Apr 22 21:18:13.752305 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:13.752276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd9b9bfcc-pnhzv" event={"ID":"911b8efe-a3a8-4725-b945-cd5e0976f559","Type":"ContainerStarted","Data":"d13409cdf98435db123d53664e0e89a3ad49e1283d35d8816b8e8c8d51c0e373"} Apr 22 21:18:14.757035 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:14.756995 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd9b9bfcc-pnhzv" event={"ID":"911b8efe-a3a8-4725-b945-cd5e0976f559","Type":"ContainerStarted","Data":"7b1861d7092ebcfa861f209fd735c5209707a9e1565d5d773b5e310587a9a832"} Apr 22 21:18:14.775759 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:14.775703 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cd9b9bfcc-pnhzv" podStartSLOduration=1.775687002 podStartE2EDuration="1.775687002s" podCreationTimestamp="2026-04-22 21:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:18:14.773516985 +0000 UTC m=+541.517565696" watchObservedRunningTime="2026-04-22 21:18:14.775687002 +0000 UTC m=+541.519735701" Apr 22 21:18:23.575099 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:23.575043 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:23.575099 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:23.575100 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:23.580111 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:23.580081 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:23.791262 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:23.791232 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd9b9bfcc-pnhzv" Apr 22 21:18:23.849497 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:23.849469 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68c7f7cc97-b79qb"] Apr 22 21:18:48.870851 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:48.870778 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68c7f7cc97-b79qb" podUID="adbf1819-6d53-4d45-8afb-fcbb876ef229" containerName="console" containerID="cri-o://3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61" gracePeriod=15 Apr 22 21:18:49.120654 ip-10-0-143-252 kubenswrapper[2568]: I0422 
21:18:49.120628 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68c7f7cc97-b79qb_adbf1819-6d53-4d45-8afb-fcbb876ef229/console/0.log" Apr 22 21:18:49.120794 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.120696 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68c7f7cc97-b79qb" Apr 22 21:18:49.174791 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.174697 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-oauth-serving-cert\") pod \"adbf1819-6d53-4d45-8afb-fcbb876ef229\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " Apr 22 21:18:49.174791 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.174747 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-oauth-config\") pod \"adbf1819-6d53-4d45-8afb-fcbb876ef229\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " Apr 22 21:18:49.174791 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.174772 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-config\") pod \"adbf1819-6d53-4d45-8afb-fcbb876ef229\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " Apr 22 21:18:49.175028 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.174821 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-serving-cert\") pod \"adbf1819-6d53-4d45-8afb-fcbb876ef229\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " Apr 22 21:18:49.175028 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.174849 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-service-ca\") pod \"adbf1819-6d53-4d45-8afb-fcbb876ef229\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " Apr 22 21:18:49.175028 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.174907 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-trusted-ca-bundle\") pod \"adbf1819-6d53-4d45-8afb-fcbb876ef229\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " Apr 22 21:18:49.175028 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.174935 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpv2z\" (UniqueName: \"kubernetes.io/projected/adbf1819-6d53-4d45-8afb-fcbb876ef229-kube-api-access-fpv2z\") pod \"adbf1819-6d53-4d45-8afb-fcbb876ef229\" (UID: \"adbf1819-6d53-4d45-8afb-fcbb876ef229\") " Apr 22 21:18:49.175298 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.175273 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-config" (OuterVolumeSpecName: "console-config") pod "adbf1819-6d53-4d45-8afb-fcbb876ef229" (UID: "adbf1819-6d53-4d45-8afb-fcbb876ef229"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:18:49.175381 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.175355 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-service-ca" (OuterVolumeSpecName: "service-ca") pod "adbf1819-6d53-4d45-8afb-fcbb876ef229" (UID: "adbf1819-6d53-4d45-8afb-fcbb876ef229"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:18:49.175467 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.175430 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "adbf1819-6d53-4d45-8afb-fcbb876ef229" (UID: "adbf1819-6d53-4d45-8afb-fcbb876ef229"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:18:49.175531 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.175500 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "adbf1819-6d53-4d45-8afb-fcbb876ef229" (UID: "adbf1819-6d53-4d45-8afb-fcbb876ef229"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:18:49.177029 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.177004 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "adbf1819-6d53-4d45-8afb-fcbb876ef229" (UID: "adbf1819-6d53-4d45-8afb-fcbb876ef229"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:18:49.177140 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.177054 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "adbf1819-6d53-4d45-8afb-fcbb876ef229" (UID: "adbf1819-6d53-4d45-8afb-fcbb876ef229"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:18:49.177140 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.177075 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbf1819-6d53-4d45-8afb-fcbb876ef229-kube-api-access-fpv2z" (OuterVolumeSpecName: "kube-api-access-fpv2z") pod "adbf1819-6d53-4d45-8afb-fcbb876ef229" (UID: "adbf1819-6d53-4d45-8afb-fcbb876ef229"). InnerVolumeSpecName "kube-api-access-fpv2z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:18:49.275983 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.275945 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-oauth-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:18:49.275983 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.275977 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-config\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:18:49.275983 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.275986 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/adbf1819-6d53-4d45-8afb-fcbb876ef229-console-serving-cert\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:18:49.276212 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.275995 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-service-ca\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:18:49.276212 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.276005 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-trusted-ca-bundle\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:18:49.276212 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.276016 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fpv2z\" (UniqueName: \"kubernetes.io/projected/adbf1819-6d53-4d45-8afb-fcbb876ef229-kube-api-access-fpv2z\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:18:49.276212 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.276024 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/adbf1819-6d53-4d45-8afb-fcbb876ef229-oauth-serving-cert\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:18:49.878898 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.878870 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68c7f7cc97-b79qb_adbf1819-6d53-4d45-8afb-fcbb876ef229/console/0.log" Apr 22 21:18:49.879332 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.878912 2568 generic.go:358] "Generic (PLEG): container finished" podID="adbf1819-6d53-4d45-8afb-fcbb876ef229" containerID="3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61" exitCode=2 Apr 22 21:18:49.879332 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.878968 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68c7f7cc97-b79qb" event={"ID":"adbf1819-6d53-4d45-8afb-fcbb876ef229","Type":"ContainerDied","Data":"3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61"} Apr 22 21:18:49.879332 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.878983 2568 util.go:48] "No ready sandbox for pod can be found. 
Apr 22 21:18:49.879332 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.878997 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68c7f7cc97-b79qb" event={"ID":"adbf1819-6d53-4d45-8afb-fcbb876ef229","Type":"ContainerDied","Data":"938b709fd3146b52eea9d0e0dd8bb209f2228dd831d53c02563d674d2d1397ce"}
Apr 22 21:18:49.879332 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.879013 2568 scope.go:117] "RemoveContainer" containerID="3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61"
Apr 22 21:18:49.887724 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.887700 2568 scope.go:117] "RemoveContainer" containerID="3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61"
Apr 22 21:18:49.888017 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:18:49.887991 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61\": container with ID starting with 3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61 not found: ID does not exist" containerID="3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61"
Apr 22 21:18:49.888085 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.888028 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61"} err="failed to get container status \"3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61\": rpc error: code = NotFound desc = could not find container \"3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61\": container with ID starting with 3f66e29f19d866d546fbc04c97314a8c7b0f4dad1c6165093a8136b250a87f61 not found: ID does not exist"
Apr 22 21:18:49.896659 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.896630 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68c7f7cc97-b79qb"]
Apr 22 21:18:49.899838 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:49.899809 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68c7f7cc97-b79qb"]
Apr 22 21:18:51.840136 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:51.840097 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbf1819-6d53-4d45-8afb-fcbb876ef229" path="/var/lib/kubelet/pods/adbf1819-6d53-4d45-8afb-fcbb876ef229/volumes"
Apr 22 21:18:59.825833 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.825796 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"]
Apr 22 21:18:59.826320 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.826232 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adbf1819-6d53-4d45-8afb-fcbb876ef229" containerName="console"
Apr 22 21:18:59.826320 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.826247 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbf1819-6d53-4d45-8afb-fcbb876ef229" containerName="console"
Apr 22 21:18:59.826423 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.826322 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="adbf1819-6d53-4d45-8afb-fcbb876ef229" containerName="console"
Apr 22 21:18:59.829356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.829340 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"
Apr 22 21:18:59.831899 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.831879 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 22 21:18:59.832017 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.831941 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-wf2jn\""
Apr 22 21:18:59.852729 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.852686 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"]
Apr 22 21:18:59.854636 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.854603 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"
Apr 22 21:18:59.854850 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.854834 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"
Apr 22 21:18:59.855006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.854949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"
Apr 22 21:18:59.855129 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.855020 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97nf\" (UniqueName: \"kubernetes.io/projected/78d27d4c-e729-427f-a1aa-326c30f8fbab-kube-api-access-b97nf\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"
Apr 22 21:18:59.855129 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.855067 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"
Apr 22 21:18:59.855129 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.855098 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"
\"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.855293 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.855184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.855293 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.855226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/78d27d4c-e729-427f-a1aa-326c30f8fbab-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.855293 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.855263 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.955874 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.955843 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956058 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.955880 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956058 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.955907 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b97nf\" (UniqueName: \"kubernetes.io/projected/78d27d4c-e729-427f-a1aa-326c30f8fbab-kube-api-access-b97nf\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956058 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.955955 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 
21:18:59.956058 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.955987 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956058 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956018 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956314 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956137 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/78d27d4c-e729-427f-a1aa-326c30f8fbab-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956314 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956314 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956230 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956314 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956278 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956314 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956281 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.956877 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956850 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-workload-certs\") pod 
\"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.957010 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/78d27d4c-e729-427f-a1aa-326c30f8fbab-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.957010 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.956944 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.958548 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.958527 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.958696 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.958682 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.965700 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.965677 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/78d27d4c-e729-427f-a1aa-326c30f8fbab-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:18:59.965812 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:18:59.965751 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b97nf\" (UniqueName: \"kubernetes.io/projected/78d27d4c-e729-427f-a1aa-326c30f8fbab-kube-api-access-b97nf\") pod \"maas-default-gateway-openshift-default-845c6b4b48-b2wxv\" (UID: \"78d27d4c-e729-427f-a1aa-326c30f8fbab\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:19:00.143740 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:00.143694 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:19:00.270013 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:00.269984 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv"] Apr 22 21:19:00.272286 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:19:00.272253 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d27d4c_e729_427f_a1aa_326c30f8fbab.slice/crio-42d7084257c1341d717b523b262745d890364c4891aa91d4bb921f93bc91cbf8 WatchSource:0}: Error finding container 42d7084257c1341d717b523b262745d890364c4891aa91d4bb921f93bc91cbf8: Status 404 returned error can't find the container with id 42d7084257c1341d717b523b262745d890364c4891aa91d4bb921f93bc91cbf8 Apr 22 21:19:00.917623 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:00.917586 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" event={"ID":"78d27d4c-e729-427f-a1aa-326c30f8fbab","Type":"ContainerStarted","Data":"42d7084257c1341d717b523b262745d890364c4891aa91d4bb921f93bc91cbf8"} Apr 22 21:19:02.827329 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:02.827284 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 22 21:19:02.827637 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:02.827364 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 22 21:19:02.827637 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:02.827403 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 22 21:19:02.928507 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:02.928468 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" event={"ID":"78d27d4c-e729-427f-a1aa-326c30f8fbab","Type":"ContainerStarted","Data":"3491e887df5ed40e8f7160c901e3532362d99e33b06d73661f59ef224fd03354"} Apr 22 21:19:02.947644 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:02.947584 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" podStartSLOduration=1.395083109 podStartE2EDuration="3.947566042s" podCreationTimestamp="2026-04-22 21:18:59 +0000 UTC" firstStartedPulling="2026-04-22 21:19:00.274507485 +0000 UTC m=+587.018556165" lastFinishedPulling="2026-04-22 21:19:02.826990418 +0000 UTC m=+589.571039098" observedRunningTime="2026-04-22 21:19:02.944580487 +0000 UTC m=+589.688629183" watchObservedRunningTime="2026-04-22 21:19:02.947566042 +0000 UTC m=+589.691614742" Apr 22 21:19:03.144492 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:03.144451 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:19:03.150083 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:03.150051 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:19:03.932540 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:03.932505 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:19:03.933514 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:03.933493 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-b2wxv" Apr 22 21:19:04.057447 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.057403 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:19:04.063119 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.063092 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:04.064446 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.064418 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:19:04.065562 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.065522 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 21:19:04.065692 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.065611 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 21:19:04.065761 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.065736 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5sldk\"" Apr 22 21:19:04.066096 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.066071 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 21:19:04.091988 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.091952 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:19:04.095760 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.095732 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/38f62bb4-aaae-453a-8592-dfafb12cc5f1-config-file\") pod \"limitador-limitador-78c99df468-lj74w\" (UID: \"38f62bb4-aaae-453a-8592-dfafb12cc5f1\") " pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:04.095935 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.095850 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw5rv\" (UniqueName: \"kubernetes.io/projected/38f62bb4-aaae-453a-8592-dfafb12cc5f1-kube-api-access-vw5rv\") pod \"limitador-limitador-78c99df468-lj74w\" (UID: \"38f62bb4-aaae-453a-8592-dfafb12cc5f1\") " pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:04.197065 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.196982 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw5rv\" (UniqueName: \"kubernetes.io/projected/38f62bb4-aaae-453a-8592-dfafb12cc5f1-kube-api-access-vw5rv\") pod \"limitador-limitador-78c99df468-lj74w\" (UID: \"38f62bb4-aaae-453a-8592-dfafb12cc5f1\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:04.197211 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.197070 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/38f62bb4-aaae-453a-8592-dfafb12cc5f1-config-file\") pod \"limitador-limitador-78c99df468-lj74w\" (UID: \"38f62bb4-aaae-453a-8592-dfafb12cc5f1\") " pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:04.197677 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.197653 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/38f62bb4-aaae-453a-8592-dfafb12cc5f1-config-file\") pod \"limitador-limitador-78c99df468-lj74w\" (UID: \"38f62bb4-aaae-453a-8592-dfafb12cc5f1\") " pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:04.205448 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.205418 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw5rv\" (UniqueName: \"kubernetes.io/projected/38f62bb4-aaae-453a-8592-dfafb12cc5f1-kube-api-access-vw5rv\") pod \"limitador-limitador-78c99df468-lj74w\" (UID: \"38f62bb4-aaae-453a-8592-dfafb12cc5f1\") " pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:04.374508 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.374467 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:04.499033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.498996 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:19:04.502811 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:19:04.502783 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38f62bb4_aaae_453a_8592_dfafb12cc5f1.slice/crio-6df0f6ea764092c08679a6475375361033c58ce5fa003412541e346d2e1e027d WatchSource:0}: Error finding container 6df0f6ea764092c08679a6475375361033c58ce5fa003412541e346d2e1e027d: Status 404 returned error can't find the container with id 6df0f6ea764092c08679a6475375361033c58ce5fa003412541e346d2e1e027d Apr 22 21:19:04.542512 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.542476 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jks5n"] Apr 22 21:19:04.547555 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.547533 2568 util.go:30] "No sandbox for pod can be found. 
Apr 22 21:19:04.549858 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.549839 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q9rz8\""
Apr 22 21:19:04.553672 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.553642 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jks5n"]
Apr 22 21:19:04.600774 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.600736 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4kkc\" (UniqueName: \"kubernetes.io/projected/e7491713-c4e1-4e01-9552-f891ab49fb72-kube-api-access-t4kkc\") pod \"authorino-f99f4b5cd-jks5n\" (UID: \"e7491713-c4e1-4e01-9552-f891ab49fb72\") " pod="kuadrant-system/authorino-f99f4b5cd-jks5n"
Apr 22 21:19:04.701610 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.701571 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4kkc\" (UniqueName: \"kubernetes.io/projected/e7491713-c4e1-4e01-9552-f891ab49fb72-kube-api-access-t4kkc\") pod \"authorino-f99f4b5cd-jks5n\" (UID: \"e7491713-c4e1-4e01-9552-f891ab49fb72\") " pod="kuadrant-system/authorino-f99f4b5cd-jks5n"
Apr 22 21:19:04.709308 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.709231 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4kkc\" (UniqueName: \"kubernetes.io/projected/e7491713-c4e1-4e01-9552-f891ab49fb72-kube-api-access-t4kkc\") pod \"authorino-f99f4b5cd-jks5n\" (UID: \"e7491713-c4e1-4e01-9552-f891ab49fb72\") " pod="kuadrant-system/authorino-f99f4b5cd-jks5n"
Apr 22 21:19:04.858708 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.858652 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jks5n"
Apr 22 21:19:04.939598 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:04.939414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" event={"ID":"38f62bb4-aaae-453a-8592-dfafb12cc5f1","Type":"ContainerStarted","Data":"6df0f6ea764092c08679a6475375361033c58ce5fa003412541e346d2e1e027d"}
Apr 22 21:19:05.020120 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:05.020090 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jks5n"]
Apr 22 21:19:05.022690 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:19:05.022659 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7491713_c4e1_4e01_9552_f891ab49fb72.slice/crio-2fba1ea5ad5eddc1ef6fa4809103aa8150f2d787cc658ab9719b0ee31fe62b55 WatchSource:0}: Error finding container 2fba1ea5ad5eddc1ef6fa4809103aa8150f2d787cc658ab9719b0ee31fe62b55: Status 404 returned error can't find the container with id 2fba1ea5ad5eddc1ef6fa4809103aa8150f2d787cc658ab9719b0ee31fe62b55
Apr 22 21:19:05.947004 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:05.946954 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jks5n" event={"ID":"e7491713-c4e1-4e01-9552-f891ab49fb72","Type":"ContainerStarted","Data":"2fba1ea5ad5eddc1ef6fa4809103aa8150f2d787cc658ab9719b0ee31fe62b55"}
Apr 22 21:19:09.803903 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:09.803869 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jks5n"]
Apr 22 21:19:09.964319 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:09.964276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" event={"ID":"38f62bb4-aaae-453a-8592-dfafb12cc5f1","Type":"ContainerStarted","Data":"dc9387a0d77c32d33437f21bdd710d6c6073080036ec33d66c34e52b8d272031"}
Apr 22 21:19:09.964547 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:09.964530 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-lj74w"
Apr 22 21:19:09.965609 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:09.965585 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jks5n" event={"ID":"e7491713-c4e1-4e01-9552-f891ab49fb72","Type":"ContainerStarted","Data":"2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690"}
Apr 22 21:19:09.985699 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:09.985651 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" podStartSLOduration=1.5710254670000001 podStartE2EDuration="5.985636818s" podCreationTimestamp="2026-04-22 21:19:04 +0000 UTC" firstStartedPulling="2026-04-22 21:19:04.504848513 +0000 UTC m=+591.248897190" lastFinishedPulling="2026-04-22 21:19:08.919459862 +0000 UTC m=+595.663508541" observedRunningTime="2026-04-22 21:19:09.982371171 +0000 UTC m=+596.726419870" watchObservedRunningTime="2026-04-22 21:19:09.985636818 +0000 UTC m=+596.729685517"
Apr 22 21:19:09.995800 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:09.995754 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-jks5n" podStartSLOduration=2.100248415 podStartE2EDuration="5.995739319s" podCreationTimestamp="2026-04-22 21:19:04 +0000 UTC" firstStartedPulling="2026-04-22 21:19:05.024437853 +0000 UTC m=+591.768486531" lastFinishedPulling="2026-04-22 21:19:08.919928754 +0000 UTC m=+595.663977435" observedRunningTime="2026-04-22 21:19:09.995087359 +0000 UTC m=+596.739136060" watchObservedRunningTime="2026-04-22 21:19:09.995739319 +0000 UTC m=+596.739788018"
podCreationTimestamp="2026-04-22 21:19:04 +0000 UTC" firstStartedPulling="2026-04-22 21:19:05.024437853 +0000 UTC m=+591.768486531" lastFinishedPulling="2026-04-22 21:19:08.919928754 +0000 UTC m=+595.663977435" observedRunningTime="2026-04-22 21:19:09.995087359 +0000 UTC m=+596.739136060" watchObservedRunningTime="2026-04-22 21:19:09.995739319 +0000 UTC m=+596.739788018" Apr 22 21:19:10.968897 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:10.968854 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-jks5n" podUID="e7491713-c4e1-4e01-9552-f891ab49fb72" containerName="authorino" containerID="cri-o://2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690" gracePeriod=30 Apr 22 21:19:11.210056 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.210030 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jks5n" Apr 22 21:19:11.271989 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.271899 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4kkc\" (UniqueName: \"kubernetes.io/projected/e7491713-c4e1-4e01-9552-f891ab49fb72-kube-api-access-t4kkc\") pod \"e7491713-c4e1-4e01-9552-f891ab49fb72\" (UID: \"e7491713-c4e1-4e01-9552-f891ab49fb72\") " Apr 22 21:19:11.274089 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.274063 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7491713-c4e1-4e01-9552-f891ab49fb72-kube-api-access-t4kkc" (OuterVolumeSpecName: "kube-api-access-t4kkc") pod "e7491713-c4e1-4e01-9552-f891ab49fb72" (UID: "e7491713-c4e1-4e01-9552-f891ab49fb72"). InnerVolumeSpecName "kube-api-access-t4kkc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:19:11.372594 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.372561 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t4kkc\" (UniqueName: \"kubernetes.io/projected/e7491713-c4e1-4e01-9552-f891ab49fb72-kube-api-access-t4kkc\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:19:11.974250 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.974212 2568 generic.go:358] "Generic (PLEG): container finished" podID="e7491713-c4e1-4e01-9552-f891ab49fb72" containerID="2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690" exitCode=0 Apr 22 21:19:11.974679 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.974261 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-jks5n" Apr 22 21:19:11.974679 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.974292 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jks5n" event={"ID":"e7491713-c4e1-4e01-9552-f891ab49fb72","Type":"ContainerDied","Data":"2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690"} Apr 22 21:19:11.974679 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.974336 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-jks5n" event={"ID":"e7491713-c4e1-4e01-9552-f891ab49fb72","Type":"ContainerDied","Data":"2fba1ea5ad5eddc1ef6fa4809103aa8150f2d787cc658ab9719b0ee31fe62b55"} Apr 22 21:19:11.974679 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.974351 2568 scope.go:117] "RemoveContainer" containerID="2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690" Apr 22 21:19:11.983039 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.983020 2568 scope.go:117] "RemoveContainer" containerID="2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690" Apr 22 21:19:11.983410 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:19:11.983372 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690\": container with ID starting with 2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690 not found: ID does not exist" containerID="2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690" Apr 22 21:19:11.983526 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.983421 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690"} err="failed to get container status \"2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690\": rpc error: code = NotFound desc = could not find container \"2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690\": container with ID starting with 2c3f3cd8f4c778c3d7b782bd2cfcdd93aaba31bd8a0d615feee7e8deb4bbd690 not found: ID does not exist" Apr 22 21:19:11.995761 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:11.995727 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jks5n"] Apr 22 21:19:12.000747 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:12.000719 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-jks5n"] Apr 22 21:19:13.756511 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:13.756476 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log" Apr 22 21:19:13.756939 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:13.756569 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log" Apr 22 21:19:13.759226 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:13.759203 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:19:13.759343 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:13.759215 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:19:13.840114 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:13.840080 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7491713-c4e1-4e01-9552-f891ab49fb72" path="/var/lib/kubelet/pods/e7491713-c4e1-4e01-9552-f891ab49fb72/volumes" Apr 22 21:19:20.970378 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:20.970339 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-lj74w" Apr 22 21:19:36.841195 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:36.841105 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-22t9z"] Apr 22 21:19:36.841683 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:36.841590 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7491713-c4e1-4e01-9552-f891ab49fb72" containerName="authorino" Apr 22 21:19:36.841683 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:36.841604 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7491713-c4e1-4e01-9552-f891ab49fb72" containerName="authorino" Apr 22 21:19:36.841683 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:36.841673 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7491713-c4e1-4e01-9552-f891ab49fb72" containerName="authorino" Apr 22 21:19:36.850262 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:36.850239 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-22t9z" Apr 22 21:19:36.850514 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:36.850490 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-22t9z"] Apr 22 21:19:36.852727 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:36.852704 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q9rz8\"" Apr 22 21:19:37.008340 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.008304 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2rx8\" (UniqueName: \"kubernetes.io/projected/d277b698-9b19-48ef-a4f7-f73423fce7e7-kube-api-access-w2rx8\") pod \"authorino-8b475cf9f-22t9z\" (UID: \"d277b698-9b19-48ef-a4f7-f73423fce7e7\") " pod="kuadrant-system/authorino-8b475cf9f-22t9z" Apr 22 21:19:37.067323 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.067287 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-22t9z"] Apr 22 21:19:37.067608 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:19:37.067586 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-w2rx8], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-22t9z" podUID="d277b698-9b19-48ef-a4f7-f73423fce7e7" Apr 22 21:19:37.090653 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.090618 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68b6c97f45-zqkml"] Apr 22 21:19:37.093009 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.092990 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68b6c97f45-zqkml" Apr 22 21:19:37.098189 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.098159 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68b6c97f45-zqkml"] Apr 22 21:19:37.109794 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.109764 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2rx8\" (UniqueName: \"kubernetes.io/projected/d277b698-9b19-48ef-a4f7-f73423fce7e7-kube-api-access-w2rx8\") pod \"authorino-8b475cf9f-22t9z\" (UID: \"d277b698-9b19-48ef-a4f7-f73423fce7e7\") " pod="kuadrant-system/authorino-8b475cf9f-22t9z" Apr 22 21:19:37.117232 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.117195 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2rx8\" (UniqueName: \"kubernetes.io/projected/d277b698-9b19-48ef-a4f7-f73423fce7e7-kube-api-access-w2rx8\") pod \"authorino-8b475cf9f-22t9z\" (UID: \"d277b698-9b19-48ef-a4f7-f73423fce7e7\") " pod="kuadrant-system/authorino-8b475cf9f-22t9z" Apr 22 21:19:37.210435 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.210402 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrdc\" (UniqueName: \"kubernetes.io/projected/43e2dd48-ccaf-4d31-9cb3-48f575b894c9-kube-api-access-ngrdc\") pod \"authorino-68b6c97f45-zqkml\" (UID: \"43e2dd48-ccaf-4d31-9cb3-48f575b894c9\") " pod="kuadrant-system/authorino-68b6c97f45-zqkml" Apr 22 21:19:37.311435 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.311370 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrdc\" (UniqueName: \"kubernetes.io/projected/43e2dd48-ccaf-4d31-9cb3-48f575b894c9-kube-api-access-ngrdc\") pod \"authorino-68b6c97f45-zqkml\" (UID: \"43e2dd48-ccaf-4d31-9cb3-48f575b894c9\") " pod="kuadrant-system/authorino-68b6c97f45-zqkml" Apr 22 21:19:37.319088 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.319056 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrdc\" (UniqueName: \"kubernetes.io/projected/43e2dd48-ccaf-4d31-9cb3-48f575b894c9-kube-api-access-ngrdc\") pod \"authorino-68b6c97f45-zqkml\" (UID: \"43e2dd48-ccaf-4d31-9cb3-48f575b894c9\") " pod="kuadrant-system/authorino-68b6c97f45-zqkml" Apr 22 21:19:37.348733 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.348647 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-68b6c97f45-zqkml"] Apr 22 21:19:37.348965 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.348952 2568 util.go:30] "No sandbox for pod can be found. 
Apr 22 21:19:37.474298 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:37.474268 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-68b6c97f45-zqkml"]
Apr 22 21:19:37.476778 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:19:37.476751 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43e2dd48_ccaf_4d31_9cb3_48f575b894c9.slice/crio-aca095d907a599636875a5e4f87c7137e131bad6b5a2160acd95da763b66a8d3 WatchSource:0}: Error finding container aca095d907a599636875a5e4f87c7137e131bad6b5a2160acd95da763b66a8d3: Status 404 returned error can't find the container with id aca095d907a599636875a5e4f87c7137e131bad6b5a2160acd95da763b66a8d3
Apr 22 21:19:38.067385 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.067344 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68b6c97f45-zqkml" event={"ID":"43e2dd48-ccaf-4d31-9cb3-48f575b894c9","Type":"ContainerStarted","Data":"7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2"}
Apr 22 21:19:38.067385 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.067373 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-22t9z"
Apr 22 21:19:38.067921 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.067418 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68b6c97f45-zqkml" event={"ID":"43e2dd48-ccaf-4d31-9cb3-48f575b894c9","Type":"ContainerStarted","Data":"aca095d907a599636875a5e4f87c7137e131bad6b5a2160acd95da763b66a8d3"}
Apr 22 21:19:38.067921 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.067356 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-68b6c97f45-zqkml" podUID="43e2dd48-ccaf-4d31-9cb3-48f575b894c9" containerName="authorino" containerID="cri-o://7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2" gracePeriod=30
Apr 22 21:19:38.072483 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.072450 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-22t9z"
Apr 22 21:19:38.080603 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.080549 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68b6c97f45-zqkml" podStartSLOduration=0.71838674 podStartE2EDuration="1.080533811s" podCreationTimestamp="2026-04-22 21:19:37 +0000 UTC" firstStartedPulling="2026-04-22 21:19:37.478036344 +0000 UTC m=+624.222085025" lastFinishedPulling="2026-04-22 21:19:37.84018342 +0000 UTC m=+624.584232096" observedRunningTime="2026-04-22 21:19:38.080079813 +0000 UTC m=+624.824128513" watchObservedRunningTime="2026-04-22 21:19:38.080533811 +0000 UTC m=+624.824582511"
Apr 22 21:19:38.118073 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.118048 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2rx8\" (UniqueName: \"kubernetes.io/projected/d277b698-9b19-48ef-a4f7-f73423fce7e7-kube-api-access-w2rx8\") pod \"d277b698-9b19-48ef-a4f7-f73423fce7e7\" (UID: \"d277b698-9b19-48ef-a4f7-f73423fce7e7\") "
Apr 22 21:19:38.120071 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.120045 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d277b698-9b19-48ef-a4f7-f73423fce7e7-kube-api-access-w2rx8" (OuterVolumeSpecName: "kube-api-access-w2rx8") pod "d277b698-9b19-48ef-a4f7-f73423fce7e7" (UID: "d277b698-9b19-48ef-a4f7-f73423fce7e7"). InnerVolumeSpecName "kube-api-access-w2rx8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:19:38.219740 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.219701 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2rx8\" (UniqueName: \"kubernetes.io/projected/d277b698-9b19-48ef-a4f7-f73423fce7e7-kube-api-access-w2rx8\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\""
Apr 22 21:19:38.315327 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.315301 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68b6c97f45-zqkml"
Apr 22 21:19:38.320628 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.320603 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngrdc\" (UniqueName: \"kubernetes.io/projected/43e2dd48-ccaf-4d31-9cb3-48f575b894c9-kube-api-access-ngrdc\") pod \"43e2dd48-ccaf-4d31-9cb3-48f575b894c9\" (UID: \"43e2dd48-ccaf-4d31-9cb3-48f575b894c9\") "
Apr 22 21:19:38.322762 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.322734 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e2dd48-ccaf-4d31-9cb3-48f575b894c9-kube-api-access-ngrdc" (OuterVolumeSpecName: "kube-api-access-ngrdc") pod "43e2dd48-ccaf-4d31-9cb3-48f575b894c9" (UID: "43e2dd48-ccaf-4d31-9cb3-48f575b894c9"). InnerVolumeSpecName "kube-api-access-ngrdc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:19:38.421800 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:38.421711 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngrdc\" (UniqueName: \"kubernetes.io/projected/43e2dd48-ccaf-4d31-9cb3-48f575b894c9-kube-api-access-ngrdc\") on node \"ip-10-0-143-252.ec2.internal\" DevicePath \"\"" Apr 22 21:19:39.072326 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.072287 2568 generic.go:358] "Generic (PLEG): container finished" podID="43e2dd48-ccaf-4d31-9cb3-48f575b894c9" containerID="7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2" exitCode=0 Apr 22 21:19:39.072790 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.072338 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68b6c97f45-zqkml" Apr 22 21:19:39.072790 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.072370 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68b6c97f45-zqkml" event={"ID":"43e2dd48-ccaf-4d31-9cb3-48f575b894c9","Type":"ContainerDied","Data":"7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2"} Apr 22 21:19:39.072790 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.072416 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68b6c97f45-zqkml" event={"ID":"43e2dd48-ccaf-4d31-9cb3-48f575b894c9","Type":"ContainerDied","Data":"aca095d907a599636875a5e4f87c7137e131bad6b5a2160acd95da763b66a8d3"} Apr 22 21:19:39.072790 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.072432 2568 scope.go:117] "RemoveContainer" containerID="7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2" Apr 22 21:19:39.072790 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.072626 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-22t9z" Apr 22 21:19:39.081157 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.081137 2568 scope.go:117] "RemoveContainer" containerID="7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2" Apr 22 21:19:39.081481 ip-10-0-143-252 kubenswrapper[2568]: E0422 21:19:39.081450 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2\": container with ID starting with 7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2 not found: ID does not exist" containerID="7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2" Apr 22 21:19:39.081592 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.081491 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2"} err="failed to get container status \"7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2\": rpc error: code = NotFound desc = could not find container \"7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2\": container with ID starting with 7cc58ad14187ba71a94bad55018c51a3e8448369129e16bfac7b818aa9549da2 not found: ID does not exist" Apr 22 21:19:39.100105 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.100076 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-22t9z"] Apr 22 21:19:39.103362 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.103333 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-22t9z"] Apr 22 21:19:39.112537 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.112508 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-68b6c97f45-zqkml"] Apr 22 21:19:39.117708 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.117683 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-68b6c97f45-zqkml"] Apr 22 21:19:39.853366 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.853325 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43e2dd48-ccaf-4d31-9cb3-48f575b894c9" path="/var/lib/kubelet/pods/43e2dd48-ccaf-4d31-9cb3-48f575b894c9/volumes" Apr 22 21:19:39.853770 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:39.853751 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d277b698-9b19-48ef-a4f7-f73423fce7e7" path="/var/lib/kubelet/pods/d277b698-9b19-48ef-a4f7-f73423fce7e7/volumes" Apr 22 21:19:46.223727 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:19:46.223679 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:20:16.026772 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:16.026736 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:20:19.024096 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:19.024064 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:20:28.606073 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.606036 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6b64c7c768-k8t7j"] Apr 22 21:20:28.606533 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.606467 2568 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="43e2dd48-ccaf-4d31-9cb3-48f575b894c9" containerName="authorino" Apr 22 21:20:28.606533 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.606481 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e2dd48-ccaf-4d31-9cb3-48f575b894c9" containerName="authorino" Apr 22 21:20:28.606608 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.606559 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="43e2dd48-ccaf-4d31-9cb3-48f575b894c9" containerName="authorino" Apr 22 21:20:28.609281 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.609263 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:28.612783 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.612750 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 21:20:28.612783 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.612768 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-p9m6q\"" Apr 22 21:20:28.612964 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.612846 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 21:20:28.618821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.618796 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6b64c7c768-k8t7j"] Apr 22 21:20:28.691275 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.691239 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9df94eeb-cc27-44af-ae1f-b38dfc0dd7da-maas-api-tls\") pod \"maas-api-6b64c7c768-k8t7j\" (UID: \"9df94eeb-cc27-44af-ae1f-b38dfc0dd7da\") " pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:28.691275 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.691276 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27tt6\" (UniqueName: \"kubernetes.io/projected/9df94eeb-cc27-44af-ae1f-b38dfc0dd7da-kube-api-access-27tt6\") pod \"maas-api-6b64c7c768-k8t7j\" (UID: \"9df94eeb-cc27-44af-ae1f-b38dfc0dd7da\") " pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:28.792550 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.792512 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9df94eeb-cc27-44af-ae1f-b38dfc0dd7da-maas-api-tls\") pod \"maas-api-6b64c7c768-k8t7j\" (UID: \"9df94eeb-cc27-44af-ae1f-b38dfc0dd7da\") " pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:28.792550 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.792554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27tt6\" (UniqueName: \"kubernetes.io/projected/9df94eeb-cc27-44af-ae1f-b38dfc0dd7da-kube-api-access-27tt6\") pod \"maas-api-6b64c7c768-k8t7j\" (UID: \"9df94eeb-cc27-44af-ae1f-b38dfc0dd7da\") " pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:28.795297 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.795268 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/9df94eeb-cc27-44af-ae1f-b38dfc0dd7da-maas-api-tls\") pod \"maas-api-6b64c7c768-k8t7j\" (UID: \"9df94eeb-cc27-44af-ae1f-b38dfc0dd7da\") " 
pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:28.800229 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.800210 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27tt6\" (UniqueName: \"kubernetes.io/projected/9df94eeb-cc27-44af-ae1f-b38dfc0dd7da-kube-api-access-27tt6\") pod \"maas-api-6b64c7c768-k8t7j\" (UID: \"9df94eeb-cc27-44af-ae1f-b38dfc0dd7da\") " pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:28.922821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:28.922719 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:29.032568 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:29.032535 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:20:29.255182 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:29.255147 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6b64c7c768-k8t7j"] Apr 22 21:20:29.257502 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:20:29.257470 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9df94eeb_cc27_44af_ae1f_b38dfc0dd7da.slice/crio-fb39563f20323292214c4dd70b40448b1f5418737d5033368be3a1803c4c6289 WatchSource:0}: Error finding container fb39563f20323292214c4dd70b40448b1f5418737d5033368be3a1803c4c6289: Status 404 returned error can't find the container with id fb39563f20323292214c4dd70b40448b1f5418737d5033368be3a1803c4c6289 Apr 22 21:20:29.258750 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:29.258730 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:20:30.262444 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:30.262383 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6b64c7c768-k8t7j" event={"ID":"9df94eeb-cc27-44af-ae1f-b38dfc0dd7da","Type":"ContainerStarted","Data":"fb39563f20323292214c4dd70b40448b1f5418737d5033368be3a1803c4c6289"} Apr 22 21:20:32.027954 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.027913 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"] Apr 22 21:20:32.272047 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.272007 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6b64c7c768-k8t7j" event={"ID":"9df94eeb-cc27-44af-ae1f-b38dfc0dd7da","Type":"ContainerStarted","Data":"9f12e0c28b5161ab37675807096236e9feaa3ebcef762c44f988dddff7ecd1cb"} Apr 22 21:20:32.272221 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.272136 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6b64c7c768-k8t7j" Apr 22 21:20:32.275445 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.275417 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"] Apr 22 21:20:32.278047 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.278000 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.280276 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.280255 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-8wbgm\"" Apr 22 21:20:32.280408 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.280281 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 22 21:20:32.280408 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.280364 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 21:20:32.281208 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.281163 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 21:20:32.286901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.286878 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"] Apr 22 21:20:32.290337 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.290263 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6b64c7c768-k8t7j" podStartSLOduration=2.029468824 podStartE2EDuration="4.290246578s" podCreationTimestamp="2026-04-22 21:20:28 +0000 UTC" firstStartedPulling="2026-04-22 21:20:29.258854139 +0000 UTC m=+676.002902819" lastFinishedPulling="2026-04-22 21:20:31.519631891 +0000 UTC m=+678.263680573" observedRunningTime="2026-04-22 21:20:32.289574101 +0000 UTC m=+679.033622802" watchObservedRunningTime="2026-04-22 21:20:32.290246578 +0000 UTC m=+679.034295277" Apr 22 21:20:32.326003 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.325958 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82zmm\" (UniqueName: \"kubernetes.io/projected/24b96d7b-4309-49db-baca-abde1225826f-kube-api-access-82zmm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.326202 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.326028 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24b96d7b-4309-49db-baca-abde1225826f-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.326202 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.326054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.326202 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.326093 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.326356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.326208 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.326356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.326263 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.427122 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427078 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.427298 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.427298 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427186 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.427298 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427267 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82zmm\" (UniqueName: \"kubernetes.io/projected/24b96d7b-4309-49db-baca-abde1225826f-kube-api-access-82zmm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.427525 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24b96d7b-4309-49db-baca-abde1225826f-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" Apr 22 21:20:32.427525 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427345 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" 
Apr 22 21:20:32.427595 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427575 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:20:32.427661 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427639 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:20:32.427808 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.427789 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:20:32.429538 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.429517 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24b96d7b-4309-49db-baca-abde1225826f-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:20:32.429822 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.429808 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24b96d7b-4309-49db-baca-abde1225826f-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:20:32.435661 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.435642 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82zmm\" (UniqueName: \"kubernetes.io/projected/24b96d7b-4309-49db-baca-abde1225826f-kube-api-access-82zmm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-gvx9d\" (UID: \"24b96d7b-4309-49db-baca-abde1225826f\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:20:32.589987 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.589898 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:20:32.722633 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:32.722605 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"]
Apr 22 21:20:32.725103 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:20:32.725066 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b96d7b_4309_49db_baca_abde1225826f.slice/crio-8702cfe98520dd496a14efccc5df68be356be64c1720ae16b1b0e7a2a771c1fc WatchSource:0}: Error finding container 8702cfe98520dd496a14efccc5df68be356be64c1720ae16b1b0e7a2a771c1fc: Status 404 returned error can't find the container with id 8702cfe98520dd496a14efccc5df68be356be64c1720ae16b1b0e7a2a771c1fc
Apr 22 21:20:33.278201 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:33.278159 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" event={"ID":"24b96d7b-4309-49db-baca-abde1225826f","Type":"ContainerStarted","Data":"8702cfe98520dd496a14efccc5df68be356be64c1720ae16b1b0e7a2a771c1fc"}
Apr 22 21:20:36.636764 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:36.636595 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:20:38.283952 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:38.283921 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6b64c7c768-k8t7j"
Apr 22 21:20:38.299957 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:38.299919 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" event={"ID":"24b96d7b-4309-49db-baca-abde1225826f","Type":"ContainerStarted","Data":"78ed7fd14de15ba2efd0d173fca4cd542658f6496cbe49b058d3a1a87613e2a4"}
Apr 22 21:20:43.320352 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:43.320320 2568 generic.go:358] "Generic (PLEG): container finished" podID="24b96d7b-4309-49db-baca-abde1225826f" containerID="78ed7fd14de15ba2efd0d173fca4cd542658f6496cbe49b058d3a1a87613e2a4" exitCode=0
Apr 22 21:20:43.320717 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:43.320408 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" event={"ID":"24b96d7b-4309-49db-baca-abde1225826f","Type":"ContainerDied","Data":"78ed7fd14de15ba2efd0d173fca4cd542658f6496cbe49b058d3a1a87613e2a4"}
Apr 22 21:20:45.330229 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:45.330191 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" event={"ID":"24b96d7b-4309-49db-baca-abde1225826f","Type":"ContainerStarted","Data":"d0d1131bdef903c88121235dc28443019fc398b71b6b27529e4c0547b18f3097"}
Apr 22 21:20:45.330713 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:45.330443 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:20:45.348180 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:45.348121 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d" podStartSLOduration=1.687803347 podStartE2EDuration="13.348102948s" podCreationTimestamp="2026-04-22 21:20:32 +0000 UTC" firstStartedPulling="2026-04-22 21:20:32.726982913 +0000 UTC m=+679.471031590" lastFinishedPulling="2026-04-22 21:20:44.387282502 +0000 UTC m=+691.131331191" observedRunningTime="2026-04-22 21:20:45.346457224 +0000 UTC m=+692.090505926" watchObservedRunningTime="2026-04-22 21:20:45.348102948 +0000 UTC m=+692.092151649"
Apr 22 21:20:47.726696 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:47.726653 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:20:56.347531 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:20:56.347494 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-gvx9d"
Apr 22 21:21:44.313910 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.313870 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-dd6fb94d8-vf227"]
Apr 22 21:21:44.317517 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.317495 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dd6fb94d8-vf227"
Apr 22 21:21:44.320796 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.320776 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q9rz8\""
Apr 22 21:21:44.320883 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.320799 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 22 21:21:44.324552 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.324509 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-dd6fb94d8-vf227"]
Apr 22 21:21:44.413809 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.413767 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/61814199-f038-4265-a5ee-e967e457c3f4-tls-cert\") pod \"authorino-dd6fb94d8-vf227\" (UID: \"61814199-f038-4265-a5ee-e967e457c3f4\") " pod="kuadrant-system/authorino-dd6fb94d8-vf227"
Apr 22 21:21:44.413809 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.413809 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hps5\" (UniqueName: \"kubernetes.io/projected/61814199-f038-4265-a5ee-e967e457c3f4-kube-api-access-2hps5\") pod \"authorino-dd6fb94d8-vf227\" (UID: \"61814199-f038-4265-a5ee-e967e457c3f4\") " pod="kuadrant-system/authorino-dd6fb94d8-vf227"
Apr 22 21:21:44.514386 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.514349 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/61814199-f038-4265-a5ee-e967e457c3f4-tls-cert\") pod \"authorino-dd6fb94d8-vf227\" (UID: \"61814199-f038-4265-a5ee-e967e457c3f4\") " pod="kuadrant-system/authorino-dd6fb94d8-vf227"
Apr 22 21:21:44.514386 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.514417 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hps5\" (UniqueName: \"kubernetes.io/projected/61814199-f038-4265-a5ee-e967e457c3f4-kube-api-access-2hps5\") pod \"authorino-dd6fb94d8-vf227\" (UID: \"61814199-f038-4265-a5ee-e967e457c3f4\") " pod="kuadrant-system/authorino-dd6fb94d8-vf227"
Apr 22 21:21:44.516921 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.516893 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/61814199-f038-4265-a5ee-e967e457c3f4-tls-cert\") pod \"authorino-dd6fb94d8-vf227\" (UID: \"61814199-f038-4265-a5ee-e967e457c3f4\") " pod="kuadrant-system/authorino-dd6fb94d8-vf227"
Apr 22 21:21:44.522033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.522004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hps5\" (UniqueName: \"kubernetes.io/projected/61814199-f038-4265-a5ee-e967e457c3f4-kube-api-access-2hps5\") pod \"authorino-dd6fb94d8-vf227\" (UID: \"61814199-f038-4265-a5ee-e967e457c3f4\") " pod="kuadrant-system/authorino-dd6fb94d8-vf227"
Apr 22 21:21:44.628378 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.628348 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dd6fb94d8-vf227"
Apr 22 21:21:44.763062 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:44.762993 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-dd6fb94d8-vf227"]
Apr 22 21:21:44.765471 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:21:44.765443 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61814199_f038_4265_a5ee_e967e457c3f4.slice/crio-2b7ace2910dbaff8ad3a519f40c34b4e994941815e64c6f8514520262eed952c WatchSource:0}: Error finding container 2b7ace2910dbaff8ad3a519f40c34b4e994941815e64c6f8514520262eed952c: Status 404 returned error can't find the container with id 2b7ace2910dbaff8ad3a519f40c34b4e994941815e64c6f8514520262eed952c
Apr 22 21:21:45.553902 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:45.553809 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dd6fb94d8-vf227" event={"ID":"61814199-f038-4265-a5ee-e967e457c3f4","Type":"ContainerStarted","Data":"aac20a0dc0505bd6e771ebd791dfba498d2b8f88faf9cef0cad953fb0d7d5ec1"}
Apr 22 21:21:45.553902 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:21:45.553854 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dd6fb94d8-vf227" event={"ID":"61814199-f038-4265-a5ee-e967e457c3f4","Type":"ContainerStarted","Data":"2b7ace2910dbaff8ad3a519f40c34b4e994941815e64c6f8514520262eed952c"}
Apr 22 21:22:01.932086 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:22:01.932008 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-dd6fb94d8-vf227" podStartSLOduration=17.480853539 podStartE2EDuration="17.931986513s" podCreationTimestamp="2026-04-22 21:21:44 +0000 UTC" firstStartedPulling="2026-04-22 21:21:44.766648601 +0000 UTC m=+751.510697277" lastFinishedPulling="2026-04-22 21:21:45.21778157 +0000 UTC m=+751.961830251" observedRunningTime="2026-04-22 21:21:45.573901772 +0000 UTC m=+752.317950473" watchObservedRunningTime="2026-04-22 21:22:01.931986513 +0000 UTC m=+768.676035213"
Apr 22 21:22:01.933239 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:22:01.933213 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:22:06.225603 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:22:06.225569 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:22:13.533524 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:22:13.533482 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:22:24.224554 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:22:24.224515 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:22:32.032758 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:22:32.032679 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:22:42.725461 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:22:42.725424 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:22:51.524492 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:22:51.524450 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:23:02.025099 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:23:02.025055 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:24:03.644185 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:24:03.644082 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:24:13.784211 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:24:13.784184 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:24:13.784985 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:24:13.784963 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:24:13.787145 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:24:13.787122 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:24:13.787719 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:24:13.787694 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:24:19.031060 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:24:19.031022 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:24:56.830517 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:24:56.830479 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:25:14.120584 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:25:14.120539 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:25:28.231621 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:25:28.231533 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:25:43.727513 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:25:43.727476 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:26:38.036434 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:26:38.035766 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:26:46.326018 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:26:46.325977 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:27:04.029091 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:27:04.029002 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:27:12.225313 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:27:12.225238 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:27:29.823454 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:27:29.823419 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:27:37.328308 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:27:37.328269 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:28:10.227997 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:28:10.227964 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:28:19.128702 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:28:19.128664 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:28:26.430337 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:28:26.430294 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:28:35.123245 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:28:35.123165 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:28:43.224219 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:28:43.224181 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:29:00.424701 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:29:00.424663 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:29:11.321738 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:29:11.321700 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:29:13.817487 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:29:13.817456 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:29:13.818003 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:29:13.817599 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:29:13.820255 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:29:13.820236 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:29:13.820568 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:29:13.820554 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:29:58.332510 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:29:58.332412 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:30:06.829714 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:30:06.829670 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:30:15.725474 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:30:15.725431 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:30:24.322055 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:30:24.322015 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:30:32.726635 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:30:32.726587 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:30:40.932227 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:30:40.932192 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:30:50.036643 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:30:50.036606 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:30:54.924442 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:30:54.924385 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:30:59.033241 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:30:59.033200 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:31:08.830431 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:31:08.830382 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:31:16.529501 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:31:16.529463 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:31:25.725261 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:31:25.725220 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:31:34.431110 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:31:34.431020 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:31:43.220919 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:31:43.220877 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:31:51.529027 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:31:51.528987 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:32:00.231687 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:32:00.231647 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:32:08.422971 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:32:08.422926 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:32:17.635548 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:32:17.635508 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:32:25.525410 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:32:25.525352 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:34:13.846523 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:34:13.846495 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:34:13.847069 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:34:13.846728 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:34:13.852349 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:34:13.852323 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:34:13.852554 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:34:13.852330 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:34:44.230093 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:34:44.230054 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:34:49.426234 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:34:49.426194 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:35:14.129532 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:35:14.129498 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:35:18.729700 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:35:18.729658 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:35:28.238033 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:35:28.237986 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:35:39.537577 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:35:39.537538 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:35:47.229502 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:35:47.229453 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:35:59.224600 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:35:59.224517 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:36:07.332692 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:36:07.332650 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:36:18.125448 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:36:18.125409 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:36:26.731997 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:36:26.731962 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:36:37.329770 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:36:37.329726 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:36:46.426332 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:36:46.426297 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:37:20.828437 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:37:20.827710 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:38:03.432143 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:38:03.432044 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:38:12.431431 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:38:12.431375 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:38:21.334175 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:38:21.334137 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:38:29.128317 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:38:29.128271 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:38:38.629085 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:38:38.629047 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:38:50.929628 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:38:50.929590 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:38:59.727217 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:38:59.727129 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:39:08.026768 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:08.026725 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:39:13.880702 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:13.880669 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:39:13.881614 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:13.881586 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:39:13.883543 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:13.883519 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:39:13.884220 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:13.884200 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:39:15.934782 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:15.934745 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:39:23.627553 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:23.627507 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:39:32.425941 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:32.425895 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:39:42.930314 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:39:42.930274 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:40:01.133010 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:40:01.132965 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:40:09.524204 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:40:09.524158 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:40:18.425466 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:40:18.425432 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:40:25.724887 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:40:25.724846 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:40:43.730000 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:40:43.729916 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:40:51.731755 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:40:51.731706 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:41:00.530139 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:41:00.530099 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:41:08.135072 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:41:08.135027 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:41:17.430045 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:41:17.429972 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:41:25.525036 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:41:25.524997 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:41:34.730127 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:41:34.730084 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:41:46.533782 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:41:46.533739 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:41:55.638908 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:41:55.638870 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:42:09.159245 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:42:09.159165 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:42:17.140501 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:42:17.140462 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:42:24.335722 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:42:24.335680 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:42:33.648953 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:42:33.648911 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:42:41.756487 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:42:41.756449 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:42:58.432962 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:42:58.432924 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:43:06.974826 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:43:06.974784 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:43:15.875815 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:43:15.875776 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:43:23.749925 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:43:23.749881 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:43:48.139420 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:43:48.139321 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:44:00.349693 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:00.349655 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-lj74w"]
Apr 22 21:44:01.811567 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:01.811523 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-dd6fb94d8-vf227_61814199-f038-4265-a5ee-e967e457c3f4/authorino/0.log"
Apr 22 21:44:06.116468 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:06.116435 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6b64c7c768-k8t7j_9df94eeb-cc27-44af-ae1f-b38dfc0dd7da/maas-api/0.log"
Apr 22 21:44:06.577596 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:06.577522 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65d8664856-rdsxk_948cd868-138f-4d97-8db2-f99ecb3f2f0b/manager/0.log"
Apr 22 21:44:08.042182 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:08.042144 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-dd6fb94d8-vf227_61814199-f038-4265-a5ee-e967e457c3f4/authorino/0.log"
Apr 22 21:44:08.719738 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:08.719708 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-lj74w_38f62bb4-aaae-453a-8592-dfafb12cc5f1/limitador/0.log"
Apr 22 21:44:09.387174 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:09.387142 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b8c5f7f67-9jf78_a49e7b51-7394-4b9e-aac2-fc8c2586780f/kube-auth-proxy/0.log"
Apr 22 21:44:09.615785 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:09.615754 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-b2wxv_78d27d4c-e729-427f-a1aa-326c30f8fbab/istio-proxy/0.log"
Apr 22 21:44:10.269512 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:10.269470 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-gvx9d_24b96d7b-4309-49db-baca-abde1225826f/storage-initializer/0.log"
Apr 22 21:44:10.275881 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:10.275857 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-gvx9d_24b96d7b-4309-49db-baca-abde1225826f/main/0.log"
Apr 22 21:44:13.907191 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:13.907165 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:44:13.909080 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:13.909057 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log"
Apr 22 21:44:13.909947 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:13.909927 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:44:13.911637 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:13.911620 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log"
Apr 22 21:44:14.361243 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.361206 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ddjjb/must-gather-gd86t"]
Apr 22 21:44:14.365628 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.365601 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ddjjb/must-gather-gd86t"
Apr 22 21:44:14.368116 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.368092 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ddjjb\"/\"default-dockercfg-kv5l8\""
Apr 22 21:44:14.369179 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.369153 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ddjjb\"/\"openshift-service-ca.crt\""
Apr 22 21:44:14.369307 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.369164 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ddjjb\"/\"kube-root-ca.crt\""
Apr 22 21:44:14.382801 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.382773 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ddjjb/must-gather-gd86t"]
Apr 22 21:44:14.486405 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.486367 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjprc\" (UniqueName: \"kubernetes.io/projected/0226009c-c84e-4073-b7fd-77244a4eb3d7-kube-api-access-cjprc\") pod \"must-gather-gd86t\" (UID: \"0226009c-c84e-4073-b7fd-77244a4eb3d7\") " pod="openshift-must-gather-ddjjb/must-gather-gd86t"
Apr 22 21:44:14.486589 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.486537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0226009c-c84e-4073-b7fd-77244a4eb3d7-must-gather-output\") pod \"must-gather-gd86t\" (UID: \"0226009c-c84e-4073-b7fd-77244a4eb3d7\") " pod="openshift-must-gather-ddjjb/must-gather-gd86t"
Apr 22 21:44:14.588090 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.588051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjprc\" (UniqueName: \"kubernetes.io/projected/0226009c-c84e-4073-b7fd-77244a4eb3d7-kube-api-access-cjprc\") pod \"must-gather-gd86t\" (UID: \"0226009c-c84e-4073-b7fd-77244a4eb3d7\") " pod="openshift-must-gather-ddjjb/must-gather-gd86t"
Apr 22 21:44:14.588309 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.588138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0226009c-c84e-4073-b7fd-77244a4eb3d7-must-gather-output\") pod \"must-gather-gd86t\" (UID: \"0226009c-c84e-4073-b7fd-77244a4eb3d7\") " pod="openshift-must-gather-ddjjb/must-gather-gd86t"
Apr 22 21:44:14.588538 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.588514 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0226009c-c84e-4073-b7fd-77244a4eb3d7-must-gather-output\") pod \"must-gather-gd86t\" (UID: \"0226009c-c84e-4073-b7fd-77244a4eb3d7\") " pod="openshift-must-gather-ddjjb/must-gather-gd86t"
Apr 22 21:44:14.596998 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.596967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjprc\" (UniqueName: \"kubernetes.io/projected/0226009c-c84e-4073-b7fd-77244a4eb3d7-kube-api-access-cjprc\") pod \"must-gather-gd86t\" (UID: \"0226009c-c84e-4073-b7fd-77244a4eb3d7\") " pod="openshift-must-gather-ddjjb/must-gather-gd86t"
Apr 22 21:44:14.676050 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.675954 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ddjjb/must-gather-gd86t"
Apr 22 21:44:14.802803 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.802777 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ddjjb/must-gather-gd86t"]
Apr 22 21:44:14.805044 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:44:14.805009 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0226009c_c84e_4073_b7fd_77244a4eb3d7.slice/crio-5dc62f07ff2ea1184c71c3ab9a30253c8770d7d97d7d6a0bbcc9600011539c28 WatchSource:0}: Error finding container 5dc62f07ff2ea1184c71c3ab9a30253c8770d7d97d7d6a0bbcc9600011539c28: Status 404 returned error can't find the container with id 5dc62f07ff2ea1184c71c3ab9a30253c8770d7d97d7d6a0bbcc9600011539c28
Apr 22 21:44:14.806822 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:14.806799 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 21:44:15.581287 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:15.581252 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddjjb/must-gather-gd86t" event={"ID":"0226009c-c84e-4073-b7fd-77244a4eb3d7","Type":"ContainerStarted","Data":"5dc62f07ff2ea1184c71c3ab9a30253c8770d7d97d7d6a0bbcc9600011539c28"}
Apr 22 21:44:16.587451 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:16.587371 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddjjb/must-gather-gd86t" event={"ID":"0226009c-c84e-4073-b7fd-77244a4eb3d7","Type":"ContainerStarted","Data":"0e8ac5834d06f3f42a15a550949c9d43c5b02f4155f2394b6654a01886649622"}
Apr 22 21:44:16.587919 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:16.587462 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddjjb/must-gather-gd86t" event={"ID":"0226009c-c84e-4073-b7fd-77244a4eb3d7","Type":"ContainerStarted","Data":"2c6315eae8d1d30245144088ad89a6f5583f0345ca054d689429e2b1658b4abd"}
Apr 22 21:44:16.606009 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:16.605935 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ddjjb/must-gather-gd86t" podStartSLOduration=1.802546747 podStartE2EDuration="2.60591253s" podCreationTimestamp="2026-04-22 21:44:14 +0000 UTC" firstStartedPulling="2026-04-22 21:44:14.806973233 +0000 UTC m=+2101.551021909" lastFinishedPulling="2026-04-22 21:44:15.610339012 +0000 UTC m=+2102.354387692" observedRunningTime="2026-04-22 21:44:16.604024319 +0000 UTC m=+2103.348073019" watchObservedRunningTime="2026-04-22 21:44:16.60591253 +0000 UTC m=+2103.349961229"
Apr 22 21:44:17.178953 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:17.178921 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gkkxr_62f5ce17-e153-446d-9866-1da5180f3d9a/global-pull-secret-syncer/0.log"
Apr 22 21:44:17.328532 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:17.328491 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7znnp_2d04c13d-b019-4af4-9237-79c3ecb7fde8/konnectivity-agent/0.log"
Apr 22 21:44:17.442135 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:17.442053 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-252.ec2.internal_65d514ad771afddf9d09f127dfba4f00/haproxy/0.log"
Apr 22 21:44:21.620748 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:21.620713 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-dd6fb94d8-vf227_61814199-f038-4265-a5ee-e967e457c3f4/authorino/0.log"
Apr 22 21:44:21.806837 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:21.806803 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-lj74w_38f62bb4-aaae-453a-8592-dfafb12cc5f1/limitador/0.log"
Apr 22 21:44:23.241695 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.241628 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_229bff03-46de-4d1a-b214-0574275ea562/alertmanager/0.log"
Apr 22 21:44:23.268146 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.268112 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_229bff03-46de-4d1a-b214-0574275ea562/config-reloader/0.log"
Apr 22 21:44:23.293781 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.293507 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_229bff03-46de-4d1a-b214-0574275ea562/kube-rbac-proxy-web/0.log"
Apr 22 21:44:23.326383 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.326353 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_229bff03-46de-4d1a-b214-0574275ea562/kube-rbac-proxy/0.log"
Apr 22 21:44:23.362440 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.362349 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_229bff03-46de-4d1a-b214-0574275ea562/kube-rbac-proxy-metric/0.log"
Apr 22 21:44:23.383885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.383846 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_229bff03-46de-4d1a-b214-0574275ea562/prom-label-proxy/0.log"
Apr 22 21:44:23.412824 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.412738 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_229bff03-46de-4d1a-b214-0574275ea562/init-config-reloader/0.log"
Apr 22 21:44:23.455524 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.455483 2568 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-njxpb_533714a1-f27e-40c7-8284-efe7ee67acf7/cluster-monitoring-operator/0.log" Apr 22 21:44:23.479778 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.479743 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-c5f6s_ba347a0d-5022-49b0-bbe4-1cb18755020c/kube-state-metrics/0.log" Apr 22 21:44:23.501419 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.501308 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-c5f6s_ba347a0d-5022-49b0-bbe4-1cb18755020c/kube-rbac-proxy-main/0.log" Apr 22 21:44:23.522055 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.521980 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-c5f6s_ba347a0d-5022-49b0-bbe4-1cb18755020c/kube-rbac-proxy-self/0.log" Apr 22 21:44:23.549906 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.549874 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7fd898795-zn7t9_510cb629-1f16-4d62-b114-d87845a195c6/metrics-server/0.log" Apr 22 21:44:23.601612 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.601586 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ltfpz_f2c3bfd7-d56b-43d2-a164-4289b4e780d6/node-exporter/0.log" Apr 22 21:44:23.618082 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.618043 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ltfpz_f2c3bfd7-d56b-43d2-a164-4289b4e780d6/kube-rbac-proxy/0.log" Apr 22 21:44:23.635926 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.635900 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ltfpz_f2c3bfd7-d56b-43d2-a164-4289b4e780d6/init-textfile/0.log" Apr 22 21:44:23.773646 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.773543 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wxhxt_168b8bff-402c-4cdc-9b0e-56436a4fd9d8/kube-rbac-proxy-main/0.log" Apr 22 21:44:23.791224 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.791198 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wxhxt_168b8bff-402c-4cdc-9b0e-56436a4fd9d8/kube-rbac-proxy-self/0.log" Apr 22 21:44:23.811297 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:23.811270 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-wxhxt_168b8bff-402c-4cdc-9b0e-56436a4fd9d8/openshift-state-metrics/0.log" Apr 22 21:44:24.065492 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.065412 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-664wc_63666cc7-3705-48c1-b0fd-3a0071a0c4de/prometheus-operator/0.log" Apr 22 21:44:24.105552 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.105522 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-664wc_63666cc7-3705-48c1-b0fd-3a0071a0c4de/kube-rbac-proxy/0.log" Apr 22 21:44:24.184523 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.184494 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-54bbfcf75c-kqd4g_16f44f55-57f0-4a60-b4a0-58b6921842e0/telemeter-client/0.log" Apr 22 21:44:24.222101 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.222074 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54bbfcf75c-kqd4g_16f44f55-57f0-4a60-b4a0-58b6921842e0/reload/0.log" Apr 22 21:44:24.244153 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.244118 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54bbfcf75c-kqd4g_16f44f55-57f0-4a60-b4a0-58b6921842e0/kube-rbac-proxy/0.log" Apr 22 21:44:24.279003 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.278967 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69bdf86478-lhs6r_4fd1aa70-782d-45cd-8d2d-dd1426761edb/thanos-query/0.log" Apr 22 21:44:24.297617 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.297582 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69bdf86478-lhs6r_4fd1aa70-782d-45cd-8d2d-dd1426761edb/kube-rbac-proxy-web/0.log" Apr 22 21:44:24.333601 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.333512 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69bdf86478-lhs6r_4fd1aa70-782d-45cd-8d2d-dd1426761edb/kube-rbac-proxy/0.log" Apr 22 21:44:24.350156 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.350127 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69bdf86478-lhs6r_4fd1aa70-782d-45cd-8d2d-dd1426761edb/prom-label-proxy/0.log" Apr 22 21:44:24.366273 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.366239 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69bdf86478-lhs6r_4fd1aa70-782d-45cd-8d2d-dd1426761edb/kube-rbac-proxy-rules/0.log" Apr 22 21:44:24.392021 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:24.391349 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-69bdf86478-lhs6r_4fd1aa70-782d-45cd-8d2d-dd1426761edb/kube-rbac-proxy-metrics/0.log" Apr 22 21:44:25.537068 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:25.537040 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-zqj22_b546034a-3b47-42da-a1d5-9685baf19e4f/networking-console-plugin/0.log" Apr 22 21:44:26.058018 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.057984 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/2.log" Apr 22 21:44:26.060862 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.060833 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-sc42w_33c46c61-c2a6-4c05-bf38-c25734b80329/console-operator/3.log" Apr 22 21:44:26.111041 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.110999 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5"] Apr 22 21:44:26.119557 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.119525 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.120921 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.120894 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5"] Apr 22 21:44:26.213006 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.212972 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgbgg\" (UniqueName: \"kubernetes.io/projected/ee1deccd-8a6c-4143-8e8c-a96a164bea55-kube-api-access-cgbgg\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.213175 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.213026 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-podres\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.213175 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.213058 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-lib-modules\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.213175 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.213146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-sys\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.213283 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.213184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-proc\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314587 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314500 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgbgg\" (UniqueName: \"kubernetes.io/projected/ee1deccd-8a6c-4143-8e8c-a96a164bea55-kube-api-access-cgbgg\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314587 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314561 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-podres\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-lib-modules\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314627 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-sys\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314651 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-proc\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314761 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-podres\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314777 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-lib-modules\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314777 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-proc\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.314821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.314822 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee1deccd-8a6c-4143-8e8c-a96a164bea55-sys\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.323125 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.323096 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgbgg\" (UniqueName: \"kubernetes.io/projected/ee1deccd-8a6c-4143-8e8c-a96a164bea55-kube-api-access-cgbgg\") pod \"perf-node-gather-daemonset-9l5z5\" (UID: \"ee1deccd-8a6c-4143-8e8c-a96a164bea55\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.436632 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.436579 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:26.567264 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.567136 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cd9b9bfcc-pnhzv_911b8efe-a3a8-4725-b945-cd5e0976f559/console/0.log" Apr 22 21:44:26.596792 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.596760 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-9gw7r_d077eb7b-f486-4c86-a813-5f7063fd7016/download-server/0.log" Apr 22 21:44:26.598514 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.598493 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5"] Apr 22 21:44:26.600667 ip-10-0-143-252 kubenswrapper[2568]: W0422 21:44:26.600631 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podee1deccd_8a6c_4143_8e8c_a96a164bea55.slice/crio-ed334947c81906be456c362f159d3f99ac54ca98f478999c048f004c6a3cccba WatchSource:0}: Error finding container ed334947c81906be456c362f159d3f99ac54ca98f478999c048f004c6a3cccba: Status 404 returned error can't find the container with id ed334947c81906be456c362f159d3f99ac54ca98f478999c048f004c6a3cccba Apr 22 21:44:26.638659 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:26.638235 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" event={"ID":"ee1deccd-8a6c-4143-8e8c-a96a164bea55","Type":"ContainerStarted","Data":"ed334947c81906be456c362f159d3f99ac54ca98f478999c048f004c6a3cccba"} Apr 22 21:44:27.643356 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:27.643323 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" event={"ID":"ee1deccd-8a6c-4143-8e8c-a96a164bea55","Type":"ContainerStarted","Data":"fb23c861d07a39bb8451a4f072fd40f09c72348390c974eb152cd8353c796bf4"} Apr 22 21:44:27.643768 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:27.643454 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:27.661304 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:27.661248 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" podStartSLOduration=1.661231336 podStartE2EDuration="1.661231336s" podCreationTimestamp="2026-04-22 21:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:44:27.658563073 +0000 UTC m=+2114.402611783" watchObservedRunningTime="2026-04-22 21:44:27.661231336 +0000 UTC m=+2114.405280031" Apr 22 21:44:27.967557 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:27.967483 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qvbng_fd364cec-0032-4596-8b42-09cb588be2ad/dns/0.log" Apr 22 21:44:27.985889 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:27.985859 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qvbng_fd364cec-0032-4596-8b42-09cb588be2ad/kube-rbac-proxy/0.log" Apr 22 21:44:28.004982 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:28.004956 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-2ctrd_04aad4f5-cc85-41af-a470-1fa752e56411/dns-node-resolver/0.log" Apr 22 21:44:28.532020 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:28.531973 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-thbtv_91e31d0a-9404-4d86-a9a6-1f28187dbd99/node-ca/0.log" Apr 22 21:44:29.443350 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:29.443319 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b8c5f7f67-9jf78_a49e7b51-7394-4b9e-aac2-fc8c2586780f/kube-auth-proxy/0.log" Apr 22 21:44:29.519821 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:29.519784 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-b2wxv_78d27d4c-e729-427f-a1aa-326c30f8fbab/istio-proxy/0.log" Apr 22 21:44:30.031617 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:30.031590 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-m9f26_cd68afce-7631-4765-af7c-a614caf39491/serve-healthcheck-canary/0.log" Apr 22 21:44:30.601519 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:30.601489 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kdgqq_befc466d-f224-4b13-8b92-963767ecc9a0/kube-rbac-proxy/0.log" Apr 22 21:44:30.618331 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:30.618296 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kdgqq_befc466d-f224-4b13-8b92-963767ecc9a0/exporter/0.log" Apr 22 21:44:30.635540 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:30.635510 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kdgqq_befc466d-f224-4b13-8b92-963767ecc9a0/extractor/0.log" Apr 22 21:44:32.587925 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:32.587892 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6b64c7c768-k8t7j_9df94eeb-cc27-44af-ae1f-b38dfc0dd7da/maas-api/0.log" Apr 22 21:44:32.737498 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:32.737464 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65d8664856-rdsxk_948cd868-138f-4d97-8db2-f99ecb3f2f0b/manager/0.log" Apr 22 21:44:33.656898 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:33.656865 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-9l5z5" Apr 22 21:44:33.940521 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:33.940436 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5db7bf5949-sqdz6_b84866af-bef1-4d0c-aa5c-ff1de43156ca/manager/0.log" Apr 22 21:44:33.959401 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:33.959355 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-cmtnw_7925b8e2-1630-4189-9f93-745459937e16/openshift-lws-operator/0.log" Apr 22 21:44:38.228836 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:38.228763 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vhcs8_dcd1d000-ff39-4932-83e0-62a9d1c5575b/migrator/0.log" Apr 22 21:44:38.244478 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:38.244448 2568 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vhcs8_dcd1d000-ff39-4932-83e0-62a9d1c5575b/graceful-termination/0.log" Apr 22 21:44:39.529540 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.529512 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dr8fz_f02b6849-41ba-4491-9415-4a546ba5e3bb/kube-multus-additional-cni-plugins/0.log" Apr 22 21:44:39.547797 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.547770 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dr8fz_f02b6849-41ba-4491-9415-4a546ba5e3bb/egress-router-binary-copy/0.log" Apr 22 21:44:39.563496 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.563466 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dr8fz_f02b6849-41ba-4491-9415-4a546ba5e3bb/cni-plugins/0.log" Apr 22 21:44:39.579901 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.579869 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dr8fz_f02b6849-41ba-4491-9415-4a546ba5e3bb/bond-cni-plugin/0.log" Apr 22 21:44:39.598200 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.598176 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dr8fz_f02b6849-41ba-4491-9415-4a546ba5e3bb/routeoverride-cni/0.log" Apr 22 21:44:39.615444 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.615413 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dr8fz_f02b6849-41ba-4491-9415-4a546ba5e3bb/whereabouts-cni-bincopy/0.log" Apr 22 21:44:39.628537 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.628506 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dr8fz_f02b6849-41ba-4491-9415-4a546ba5e3bb/whereabouts-cni/0.log" Apr 22 21:44:39.896556 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.896526 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-b85gf_d5843d3e-a9c1-40f4-918c-77998582dbee/kube-multus/0.log" Apr 22 21:44:39.999150 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:39.999122 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hptqt_605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f/network-metrics-daemon/0.log" Apr 22 21:44:40.013885 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:40.013859 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hptqt_605a4e19-b663-46e8-9fcc-e4bd8f2e9c4f/kube-rbac-proxy/0.log" Apr 22 21:44:41.066985 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:41.066945 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-controller/0.log" Apr 22 21:44:41.087037 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:41.087000 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/0.log" Apr 22 21:44:41.096593 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:41.096563 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovn-acl-logging/1.log" Apr 22 21:44:41.112961 ip-10-0-143-252 
kubenswrapper[2568]: I0422 21:44:41.112933 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/kube-rbac-proxy-node/0.log" Apr 22 21:44:41.131689 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:41.131663 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 21:44:41.148603 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:41.148576 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/northd/0.log" Apr 22 21:44:41.164550 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:41.164523 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/nbdb/0.log" Apr 22 21:44:41.181172 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:41.181144 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/sbdb/0.log" Apr 22 21:44:41.289541 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:41.289509 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j6pw2_7024fa13-11c9-4df5-bb63-12212dd14ff1/ovnkube-controller/0.log" Apr 22 21:44:42.631178 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:42.631138 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-jhrq6_50847da2-6189-4704-b652-a6ab02809bf2/check-endpoints/0.log" Apr 22 21:44:42.688892 ip-10-0-143-252 kubenswrapper[2568]: I0422 21:44:42.688861 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-s8cvq_68e95be5-6911-44d9-88c0-a14e0becfcb5/network-check-target-container/0.log"