Apr 20 21:44:37.086978 ip-10-0-140-110 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 21:44:37.086992 ip-10-0-140-110 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 21:44:37.087001 ip-10-0-140-110 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 21:44:37.087278 ip-10-0-140-110 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 21:44:47.247387 ip-10-0-140-110 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 21:44:47.247407 ip-10-0-140-110 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot c2d89285c03749769bd352bd5fcee224 --
Apr 20 21:47:09.719676 ip-10-0-140-110 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 21:47:10.179938 ip-10-0-140-110 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:47:10.179938 ip-10-0-140-110 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 21:47:10.179938 ip-10-0-140-110 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:47:10.179938 ip-10-0-140-110 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 21:47:10.179938 ip-10-0-140-110 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 21:47:10.181906 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.181797 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 21:47:10.185100 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185079 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:10.185100 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185097 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:10.185100 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185102 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:10.185100 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185107 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185111 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185115 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185119 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185123 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185127 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185130 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185133 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185137 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185141 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185145 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185148 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185152 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185157 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185160 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185164 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185176 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185181 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185186 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185190 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:10.185337 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185193 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185197 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185201 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185205 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185209 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185213 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185217 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185221 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185226 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185230 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185234 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185238 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185242 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185246 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185252 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185256 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185271 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185276 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185280 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185284 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:10.186189 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185288 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185300 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185304 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185308 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185314 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185318 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185323 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185327 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185331 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185335 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185339 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185343 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185347 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185351 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185357 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185363 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185389 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185397 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185402 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:10.187043 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185407 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185412 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185422 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185426 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185433 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185438 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185443 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185448 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185453 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185457 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185461 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185465 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185469 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185473 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185478 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185484 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185488 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185492 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185497 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185501 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:10.187862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185505 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185509 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185513 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.185517 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186114 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186121 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186125 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186130 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186134 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186139 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186144 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186149 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186154 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186159 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186163 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186167 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186173 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186178 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186182 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186187 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:10.188516 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186191 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186195 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186199 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186203 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186208 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186211 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186215 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186219 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186224 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186228 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186232 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186239 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186245 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186251 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186256 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186260 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186265 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186270 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186275 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186279 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:10.189190 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186283 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186287 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186291 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186296 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186300 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186305 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186310 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186314 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186320 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186325 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186329 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186333 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186338 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186342 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186346 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186350 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186354 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186358 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186362 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186385 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:10.189788 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186390 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186394 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186399 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186403 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186407 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186413 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186417 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186421 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186425 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186429 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186434 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186438 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186442 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186446 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186450 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186459 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186463 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186468 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186472 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186476 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:10.190352 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186481 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186487 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186492 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186496 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186500 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186504 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186508 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186513 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186517 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.186521 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186631 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186643 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186653 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186659 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186666 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186671 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186678 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186685 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186690 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186695 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186701 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 21:47:10.190880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186706 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186711 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186715 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186720 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186725 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186730 2574 flags.go:64] FLAG: --cloud-config=""
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186737 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186742 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186749 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186754 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186759 2574 flags.go:64] FLAG: --config-dir=""
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186764 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186769 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186776 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186781 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186786 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186792 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186797 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186802 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186807 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186812 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186816 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186823 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186828 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186833 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 21:47:10.191416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186837 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186842 2574 flags.go:64] FLAG: --enable-server="true"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186847 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186856 2574 flags.go:64] FLAG: --event-burst="100"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186861 2574 flags.go:64] FLAG: --event-qps="50"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186868 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186872 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186877 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186883 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186888 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186893 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186898 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186903 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186910 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186914 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186919 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186924 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186929 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186934 2574 flags.go:64] FLAG: --feature-gates=""
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186940 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186945 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186950 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186960 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186966 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186971 2574 flags.go:64] FLAG: --help="false"
Apr 20 21:47:10.192013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186976 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186981 2574 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186986 2574 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186991 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.186997 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187002 2574 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187007 2574 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187012 2574 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187017 2574 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187021 2574 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187026 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187031 2574 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187038 2574 flags.go:64] FLAG: --kube-reserved=""
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187043 2574 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187047 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187051 2574 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187056 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187060 2574 flags.go:64] FLAG: --lock-file=""
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187064 2574 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187069 2574 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187076 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187085 2574 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187090 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187095 2574 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 21:47:10.192637 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187099 2574 flags.go:64] FLAG: --logging-format="text"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187104 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187109 2574 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187114 2574 flags.go:64] FLAG: --manifest-url=""
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187119 2574 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187126 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187132 2574 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187139 2574 flags.go:64] FLAG: --max-pods="110"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187144 2574 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187148 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187153 2574 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187158 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187163 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187167 2574 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187172 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187184 2574 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187188 2574 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187193 2574 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187198 2574 flags.go:64] FLAG: --pod-cidr=""
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187203 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187213 2574 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187218 2574 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187223 2574 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187227 2574 flags.go:64] FLAG: --port="10250"
Apr 20 21:47:10.193254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187232 2574 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187237 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0cca9cd8947160844"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187242 2574 flags.go:64] FLAG: --qos-reserved=""
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187247 2574 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187252 2574 flags.go:64] FLAG: --register-node="true"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187259 2574 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187264 2574 flags.go:64] FLAG: --register-with-taints=""
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187270 2574 flags.go:64] FLAG: --registry-burst="10"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187275 2574 flags.go:64] FLAG: --registry-qps="5"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187280 2574 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187285 2574 flags.go:64] FLAG: --reserved-memory=""
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187295 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187300 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187305 2574 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187310 2574 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187320 2574 flags.go:64] FLAG: --runonce="false"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187325 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187330 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187335 2574 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187340 2574 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187344 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187349 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187355 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187360 2574 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187365 2574 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187387 2574 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 21:47:10.193922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187391 2574 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187396 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187403 2574 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187408 2574 flags.go:64] FLAG: --system-cgroups=""
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187413 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187422 2574 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187427 2574 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187431 2574 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187442 2574 flags.go:64] FLAG: --tls-min-version=""
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187446 2574 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187451 2574 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187457 2574 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187463 2574 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187467 2574 flags.go:64] FLAG: --v="2"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187474 2574 flags.go:64] FLAG: --version="false"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187480 2574 flags.go:64] FLAG: --vmodule=""
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187487 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.187493 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187640 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187646 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187650 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187654 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187660 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:10.194700 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187665 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187669 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187673 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187678 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187682 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187686 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187690 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187694 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187699 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187703 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187708 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187715 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187719 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187723 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187727 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187734 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187740 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187745 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187750 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187754 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:10.195301 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187759 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187764 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187768 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187772 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187776 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187780 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187785 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187789 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187794 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187798 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187803 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187807 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187812 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187816 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187820 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187825 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187829 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187833 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187838 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187842 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:10.195873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187846 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187852 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187858 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187864 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187869 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187873 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187878 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187882 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187886 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187890 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187895 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187899 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187905 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187909 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187913 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187917 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187921 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187925 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187929 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:10.196427 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187933 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187938 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187942 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187946 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187950 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187955 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187960 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187964 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187968 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187972 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187976 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187981 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187985 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187989 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187993 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.187997 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.188003 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.188007 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.188011 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.188015 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:10.196982 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.188020 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.188024 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.188041 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.194523 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.194539 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194584 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194589 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194592 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194595 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194598 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194601 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194604 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194607 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194609 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194612 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:10.197499 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194615 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194618 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194620 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194623 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194625 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194628 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194631 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194633 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194636 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194639 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194641 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194644 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194647 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194649 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194651 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194654 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194657 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194659 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194662 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194664 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:10.197879 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194667 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194672 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194675 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194678 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194680 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194683 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194686 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194688 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194691 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194694 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194698 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194702 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194704 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194707 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194710 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194712 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194715 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194718 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194720 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:10.198383 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194723 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194725 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194728 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194730 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194733 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194736 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194738 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194741 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194743 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194746 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194748 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194751 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194753 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194756 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194759 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194762 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194765 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194768 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194771 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194773 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:10.198862 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194776 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194778 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194781 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194783 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194786 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194788 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194791 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194795 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194799 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194803 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194806 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194808 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194811 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194814 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194816 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194818 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:10.199506 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194821 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.194826 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194928 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194933 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194936 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194939 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194941 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194944 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194947 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194949 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194952 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194955 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194958 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194961 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194964 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 21:47:10.199916 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194966 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194969 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194971 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194974 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194977 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194979 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194982 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194985 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194987 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194990 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194992 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194995 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.194997 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195000 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195002 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195005 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195008 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195010 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195013 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195016 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 21:47:10.200291 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195018 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195020 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195023 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195025 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195028 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195030 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195033 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195036 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195038 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195041 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195044 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195047 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195049 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195052 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195055 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195057 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195060 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195062 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195064 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195067 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 21:47:10.200792 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195070 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195072 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195074 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195077 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195080 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195082 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195086 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195089 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195092 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195096 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195099 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195103 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195105 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195108 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195110 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195113 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195115 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195118 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195120 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 21:47:10.201273 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195123 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195126 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195128 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195131 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195134 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195136 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195139 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195141 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195144 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195146 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195149 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195151 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195154 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:10.195156 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.195161 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 21:47:10.201753 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.195847 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 21:47:10.202122 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.197771 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 21:47:10.202122 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.198777 2574 server.go:1019] "Starting client certificate rotation"
Apr 20 21:47:10.202122 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.198869 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 21:47:10.202122 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.199693 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 21:47:10.225475 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.225450 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 21:47:10.230829 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.230810 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 21:47:10.246679 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.246656 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 20 21:47:10.252460 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.252440 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 21:47:10.252656 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.252642 2574 log.go:25] "Validated CRI v1 image API"
Apr 20 21:47:10.253932 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.253918 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 21:47:10.256737 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.256707 2574 fs.go:135] Filesystem UUIDs: map[149184de-c6c4-4615-b6d1-cedb17ffc49e:/dev/nvme0n1p3 57b661f4-63dd-438a-8775-2ca3e944652f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 20 21:47:10.256790 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.256737 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 21:47:10.262363 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.262256 2574 manager.go:217] Machine: {Timestamp:2026-04-20 21:47:10.261087605 +0000 UTC m=+0.421941736 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3115520 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23bfa6a4ba75e0afd0b16dfa66663a SystemUUID:ec23bfa6-a4ba-75e0-afd0-b16dfa66663a BootID:c2d89285-c037-4976-9bd3-52bd5fcee224 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5b:54:c9:1a:23 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5b:54:c9:1a:23 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:88:bc:a2:3b:f7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 21:47:10.262363 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.262358 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 21:47:10.262495 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.262483 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 21:47:10.264802 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.264777 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 21:47:10.264952 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.264805 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-110.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 21:47:10.265474 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.265465 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 21:47:10.265506 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.265477 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 21:47:10.265506 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.265494 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 21:47:10.265562 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.265507 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 21:47:10.266941 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.266932 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 21:47:10.267060 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.267051 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 21:47:10.269806 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.269795 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 21:47:10.269839 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.269815 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 21:47:10.269839 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.269830 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 21:47:10.269839 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.269839 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 20 21:47:10.269957 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.269853 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 21:47:10.271084 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.271069 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 21:47:10.271165 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.271088 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 21:47:10.274044 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.274023 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 21:47:10.276112 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.276099 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 21:47:10.277687 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277666 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 21:47:10.277687 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277683 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 21:47:10.277687 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277690 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277696 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277703 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277709 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277714 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277719 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277726 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277732 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277746 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 21:47:10.277808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.277755 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 21:47:10.278570 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.278558 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 21:47:10.278570 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.278568 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 21:47:10.282311 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.282178 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 21:47:10.282398 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.282335 2574 server.go:1295] "Started kubelet"
Apr 20 21:47:10.282522 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.282448 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 21:47:10.284675 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.284648 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-110.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 21:47:10.284911 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.284860 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 21:47:10.284961 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.284943 2574 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 21:47:10.285188 ip-10-0-140-110 systemd[1]: Started Kubernetes Kubelet.
Apr 20 21:47:10.285528 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.285499 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 21:47:10.286063 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.286041 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 21:47:10.286675 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.286655 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 21:47:10.288506 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.288489 2574 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 21:47:10.290669 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.290649 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zsg4w"
Apr 20 21:47:10.291050 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.290090 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-110.ec2.internal.18a82ee8c5bae065 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-110.ec2.internal,UID:ip-10-0-140-110.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-110.ec2.internal,},FirstTimestamp:2026-04-20 21:47:10.282309733 +0000 UTC m=+0.443163864,LastTimestamp:2026-04-20 21:47:10.282309733 +0000 UTC m=+0.443163864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-110.ec2.internal,}"
Apr 20 21:47:10.292572 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.292555 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 21:47:10.292798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.292576 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 21:47:10.293499 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.293481 2574 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 21:47:10.293499 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.293500 2574 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 21:47:10.293656 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.293587 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 21:47:10.293656 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.293630 2574 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 21:47:10.293656 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.293637 2574 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 21:47:10.294015 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.293992 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:10.294808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.294792 2574 factory.go:153] Registering CRI-O factory
Apr 20 21:47:10.294808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.294807 2574 factory.go:223] Registration of the crio container factory successfully
Apr 20 21:47:10.294958 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.294852 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 21:47:10.294958 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.294859 2574 factory.go:55] Registering systemd factory
Apr 20 21:47:10.294958 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.294865 2574 factory.go:223] Registration of the systemd container factory successfully
Apr 20 21:47:10.294958 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.294883 2574 factory.go:103] Registering Raw factory
Apr 20 21:47:10.294958 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.294895 2574 manager.go:1196] Started watching for new ooms in manager
Apr 20 21:47:10.295411 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.295394 2574 manager.go:319] Starting recovery of all containers
Apr 20 21:47:10.298507 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.298476 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zsg4w"
Apr 20 21:47:10.299209 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.299184 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-110.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 21:47:10.299297 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.299284 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 21:47:10.304194 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.304178 2574 manager.go:324] Recovery completed
Apr 20 21:47:10.309364 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.309352 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:47:10.311565 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.311551 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:47:10.311635 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.311579 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:47:10.311635 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.311590 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:47:10.312053 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.312028 2574 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 21:47:10.312053 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.312041 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 21:47:10.312053 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.312055 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 21:47:10.314293 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.314282 2574 policy_none.go:49] "None policy: Start"
Apr 20 21:47:10.314343 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.314299 2574 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 21:47:10.314343 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.314309 2574 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 21:47:10.357988 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.357967 2574 manager.go:341] "Starting Device Plugin manager"
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.357999 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.358010 2574 server.go:85] "Starting device plugin registration server"
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.358209 2574 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.358223 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.358317 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.358406 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.358415 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.358896 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 21:47:10.359037 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.358927 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:10.425196 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.425161 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 21:47:10.426366 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.426346 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 21:47:10.426476 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.426387 2574 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 21:47:10.426476 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.426408 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 21:47:10.426476 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.426414 2574 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 21:47:10.426476 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.426449 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 21:47:10.428798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.428775 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:10.458851 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.458780 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:47:10.460117 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.460101 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:47:10.460193 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.460131 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:47:10.460193 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.460141 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:47:10.460193 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.460165 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.469464 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.469446 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.469512 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.469467 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-110.ec2.internal\": node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:10.482852 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.482830 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:10.527269 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.527237 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal"]
Apr 20 21:47:10.527356 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.527309 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:47:10.528134 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.528119 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:47:10.528178 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.528148 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:47:10.528178 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.528159 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:47:10.529444 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.529432 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:47:10.529609 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.529595 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.529648 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.529630 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:47:10.530132 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.530118 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:47:10.530178 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.530118 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:47:10.530178 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.530148 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:47:10.530178 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.530160 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:47:10.530178 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.530168 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:47:10.530283 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.530182 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:47:10.531333 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.531319 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.531417 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.531342 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 21:47:10.532069 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.532049 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 21:47:10.532146 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.532084 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 21:47:10.532146 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.532102 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeHasSufficientPID"
Apr 20 21:47:10.555958 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.555938 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-110.ec2.internal\" not found" node="ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.560390 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.560361 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-110.ec2.internal\" not found" node="ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.583486 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.583464 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:10.594008 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.593979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/27ef09a0806330265539f634ef8e0e80-config\") pod \"kube-apiserver-proxy-ip-10-0-140-110.ec2.internal\" (UID: \"27ef09a0806330265539f634ef8e0e80\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.594083 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.594013 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/dc8fe02abf3a02e07cd912b3d9bc28a2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal\" (UID: \"dc8fe02abf3a02e07cd912b3d9bc28a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.594083 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.594039 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc8fe02abf3a02e07cd912b3d9bc28a2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal\" (UID: \"dc8fe02abf3a02e07cd912b3d9bc28a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.684417 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.684360 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:10.694676 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.694653 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/dc8fe02abf3a02e07cd912b3d9bc28a2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal\" (UID: \"dc8fe02abf3a02e07cd912b3d9bc28a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.694730 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.694685 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc8fe02abf3a02e07cd912b3d9bc28a2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal\" (UID: \"dc8fe02abf3a02e07cd912b3d9bc28a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.694730 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.694704 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/27ef09a0806330265539f634ef8e0e80-config\") pod \"kube-apiserver-proxy-ip-10-0-140-110.ec2.internal\" (UID: \"27ef09a0806330265539f634ef8e0e80\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.694788 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.694756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/dc8fe02abf3a02e07cd912b3d9bc28a2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal\" (UID: \"dc8fe02abf3a02e07cd912b3d9bc28a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.694820 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.694756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/27ef09a0806330265539f634ef8e0e80-config\") pod \"kube-apiserver-proxy-ip-10-0-140-110.ec2.internal\" (UID: \"27ef09a0806330265539f634ef8e0e80\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.694820 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.694756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/dc8fe02abf3a02e07cd912b3d9bc28a2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal\" (UID: \"dc8fe02abf3a02e07cd912b3d9bc28a2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.785113 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.785047 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:10.858538 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.858513 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.863155 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:10.863135 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:10.885790 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.885765 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:10.986321 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:10.986285 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:11.086891 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:11.086814 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:11.187310 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:11.187279 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:11.198747 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.198720 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 21:47:11.198881 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.198866 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 21:47:11.288013 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:11.287984 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:11.293873 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.293851 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 21:47:11.300342 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.300294 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 21:42:10 +0000 UTC" deadline="2028-01-01 12:31:37.80894776 +0000 UTC"
Apr 20 21:47:11.300342 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.300340 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14894h44m26.508612686s"
Apr 20 21:47:11.303551 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.303536 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 21:47:11.322831 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.322809 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2x57l"
Apr 20 21:47:11.330469 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.330446 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2x57l"
Apr 20 21:47:11.366330 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.366305 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:11.388802 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:11.388774 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:11.423760 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.423736 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:11.433406 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:11.433355 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc8fe02abf3a02e07cd912b3d9bc28a2.slice/crio-32a22a3010dbbe98e3ed3e9dd7cee80ef1fc44e5427b44d12216d1591b6f3c74 WatchSource:0}: Error finding container 32a22a3010dbbe98e3ed3e9dd7cee80ef1fc44e5427b44d12216d1591b6f3c74: Status 404 returned error can't find the container with id 32a22a3010dbbe98e3ed3e9dd7cee80ef1fc44e5427b44d12216d1591b6f3c74
Apr 20 21:47:11.433772 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:11.433750 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27ef09a0806330265539f634ef8e0e80.slice/crio-9e5a4b8dc390853137c1f2a02bd9033196b788e055ae95bd45c580b6df1e4353 WatchSource:0}: Error finding container 9e5a4b8dc390853137c1f2a02bd9033196b788e055ae95bd45c580b6df1e4353: Status 404 returned error can't find the container with id 9e5a4b8dc390853137c1f2a02bd9033196b788e055ae95bd45c580b6df1e4353
Apr 20 21:47:11.438480 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.438467 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 21:47:11.489469 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:11.489436 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:11.590013 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:11.589978 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:11.690630 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:11.690459 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-110.ec2.internal\" not found"
Apr 20 21:47:11.747266 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.747243 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:11.794279 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.794257 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:11.810286 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.810268 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 21:47:11.811799 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.811785 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal"
Apr 20 21:47:11.820345 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:11.820329 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 21:47:12.181418 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.181363 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 21:47:12.270301 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.270269 2574 apiserver.go:52] "Watching apiserver"
Apr 20 21:47:12.275946 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.275920 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 21:47:12.277093 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.277061 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal","openshift-dns/node-resolver-6drvt","openshift-image-registry/node-ca-j5w8c","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal","openshift-multus/network-metrics-daemon-xmrt9","openshift-network-operator/iptables-alerter-8jqdc","kube-system/global-pull-secret-syncer-7524q","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj","openshift-cluster-node-tuning-operator/tuned-szpsv","openshift-multus/multus-additional-cni-plugins-fxd5b","openshift-multus/multus-kpm8f","openshift-network-diagnostics/network-check-target-hj6pg","openshift-ovn-kubernetes/ovnkube-node-mlfps","kube-system/konnectivity-agent-tgvx7"]
Apr 20 21:47:12.279615 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.279559 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:12.279744 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.279675 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4"
Apr 20 21:47:12.280579 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.280563 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6drvt"
Apr 20 21:47:12.281758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.281735 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j5w8c"
Apr 20 21:47:12.281867 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.281849 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tgvx7"
Apr 20 21:47:12.282447 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.282428 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.282595 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.282574 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sdzrt\""
Apr 20 21:47:12.282662 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.282624 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.282905 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.282888 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:12.282982 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.282949 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221"
Apr 20 21:47:12.283727 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.283705 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 21:47:12.283817 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.283754 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.283917 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.283901 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4nmcg\""
Apr 20 21:47:12.284279 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.284102 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.284279 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.284113 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 21:47:12.284279 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.284167 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 21:47:12.284491 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.284364 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jjmbf\""
Apr 20 21:47:12.285608 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.285590 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8jqdc"
Apr 20 21:47:12.285781 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.285759 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:12.287798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.287192 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj"
Apr 20 21:47:12.287798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.287286 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.287798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.287400 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.287798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.287543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-22dw7\""
Apr 20 21:47:12.287798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.287594 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.287798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.287686 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-wj45z\""
Apr 20 21:47:12.288236 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.287901 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 21:47:12.288236 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.287937 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 21:47:12.288236 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.288007 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 21:47:12.288236 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.288159 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 21:47:12.288236 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.288199 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.288741 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.288580 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 21:47:12.288831 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.288784 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 21:47:12.289094 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.289070 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bblwp\""
Apr 20 21:47:12.289198 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.289150 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.289198 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.289159 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-szpsv"
Apr 20 21:47:12.289334 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.289217 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.290731 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.290712 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fxd5b"
Apr 20 21:47:12.290863 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.290843 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.290921 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.290897 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.290996 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.290853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r6jqw\""
Apr 20 21:47:12.291853 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.291834 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kpm8f"
Apr 20 21:47:12.292562 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.292541 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 21:47:12.292659 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.292559 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 21:47:12.292875 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.292856 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-twf4d\""
Apr 20 21:47:12.292960 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.292931 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 21:47:12.293204 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.293184 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:12.293329 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.293235 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 21:47:12.293329 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.293246 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0"
pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:12.293329 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.293286 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 21:47:12.293567 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.293552 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j7kzm\"" Apr 20 21:47:12.293782 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.293762 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 21:47:12.294526 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.294509 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 21:47:12.305076 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305054 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.305164 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-kubernetes\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.305164 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.305243 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-cnibin\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.305243 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305222 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwmf\" (UniqueName: \"kubernetes.io/projected/996f85cb-2c59-49bc-b910-fcea18620d93-kube-api-access-rrwmf\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc" Apr 20 21:47:12.305320 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.305320 ip-10-0-140-110 kubenswrapper[2574]: 
I0420 21:47:12.305274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-system-cni-dir\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.305320 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-daemon-config\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.305481 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds9mr\" (UniqueName: \"kubernetes.io/projected/25ce781b-7c4c-499a-bc4a-2efb25261488-kube-api-access-ds9mr\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.305481 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-etc-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.305481 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovn-node-metrics-cert\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.305481 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mnn6\" (UniqueName: \"kubernetes.io/projected/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-kube-api-access-9mnn6\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.305647 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-lib-modules\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.305647 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305530 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrnvm\" (UniqueName: \"kubernetes.io/projected/3a1935ff-0056-494d-bd40-1316c97c620f-kube-api-access-qrnvm\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.305647 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-cni-bin\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.305647 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.305647 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305626 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-sys-fs\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-k8s-cni-cncf-io\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305674 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-run\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305701 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-hostroot\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-multus-certs\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvmtm\" (UniqueName: \"kubernetes.io/projected/a08eea80-f553-4499-a8dc-94c9591d8221-kube-api-access-xvmtm\") pod \"network-metrics-daemon-xmrt9\" (UID: 
\"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305797 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-env-overrides\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-host\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.305848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-netns\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305886 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-log-socket\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305907 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-slash\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-modprobe-d\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.305982 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-systemd\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306006 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmz7v\" (UniqueName: \"kubernetes.io/projected/088693c1-4b07-48b4-9c28-9cb217da135a-kube-api-access-xmz7v\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306030 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-cnibin\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306076 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-cni-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-os-release\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-systemd\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysconfig\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306171 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-sys\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-device-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306213 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-cni-bin\") pod 
\"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.306231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306227 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-etc-kubernetes\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-cni-netd\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306275 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a1935ff-0056-494d-bd40-1316c97c620f-hosts-file\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306394 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cf66444-7265-4d80-80d8-107f0de4d0db-cni-binary-copy\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306417 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306431 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25ce781b-7c4c-499a-bc4a-2efb25261488-host\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306447 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-kubelet\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306470 2574 
Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-var-lib-kubelet\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv"
Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/996f85cb-2c59-49bc-b910-fcea18620d93-host-slash\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc"
Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-run-netns\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysctl-d\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv"
Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-socket-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj"
Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-cni-binary-copy\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b"
Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt97m\" (UniqueName: \"kubernetes.io/projected/f2958395-eab5-4338-b6d6-170a01a66c73-kube-api-access-pt97m\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b"
Apr 20 21:47:12.306902 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306868 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-785wg\" (UniqueName: \"kubernetes.io/projected/c36fc8e4-ee32-4959-9150-79a71f56389f-kube-api-access-785wg\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306895 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-system-cni-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306919 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-kubelet\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-var-lib-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.306983 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-ovn\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysctl-conf\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307038 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-socket-dir-parent\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7548fad1-54fd-45fb-87f3-3c9b7d8d2573-konnectivity-ca\") pod \"konnectivity-agent-tgvx7\" (UID: \"7548fad1-54fd-45fb-87f3-3c9b7d8d2573\") " pod="kube-system/konnectivity-agent-tgvx7"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/996f85cb-2c59-49bc-b910-fcea18620d93-iptables-alerter-script\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovnkube-config\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a1935ff-0056-494d-bd40-1316c97c620f-tmp-dir\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-etc-selinux\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307224 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-kubelet-config\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25ce781b-7c4c-499a-bc4a-2efb25261488-serviceca\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307271 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307294 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-registration-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj"
Apr 20 21:47:12.307667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-node-log\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307358 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/088693c1-4b07-48b4-9c28-9cb217da135a-etc-tuned\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307417 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/088693c1-4b07-48b4-9c28-9cb217da135a-tmp\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307455 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdd9c\" (UniqueName: \"kubernetes.io/projected/1cf66444-7265-4d80-80d8-107f0de4d0db-kube-api-access-fdd9c\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307479 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-dbus\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovnkube-script-lib\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307538 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-os-release\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-cni-multus\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307615 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-conf-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.308295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.307666 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7548fad1-54fd-45fb-87f3-3c9b7d8d2573-agent-certs\") pod \"konnectivity-agent-tgvx7\" (UID: \"7548fad1-54fd-45fb-87f3-3c9b7d8d2573\") " pod="kube-system/konnectivity-agent-tgvx7" Apr 20 21:47:12.331857 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.331823 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:42:11 +0000 UTC" deadline="2027-11-06 07:31:17.588175384 +0000 UTC" Apr 20 21:47:12.331857 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.331856 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13545h44m5.2563226s" Apr 20 21:47:12.408470 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408429 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-run-netns\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.408632 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysctl-d\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.408632 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-socket-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.408632 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408561 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-cni-binary-copy\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.408778 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt97m\" (UniqueName: \"kubernetes.io/projected/f2958395-eab5-4338-b6d6-170a01a66c73-kube-api-access-pt97m\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.408778 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408754 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysctl-d\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.408934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408786 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-run-netns\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.408934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408793 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-785wg\" (UniqueName: \"kubernetes.io/projected/c36fc8e4-ee32-4959-9150-79a71f56389f-kube-api-access-785wg\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.408934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408832 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-system-cni-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.408934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-socket-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.408934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408874 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-kubelet\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.409164 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.408981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-system-cni-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.409164 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-var-lib-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.409164 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409009 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-kubelet\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.409164 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409043 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-ovn\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.409164 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409081 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysctl-conf\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.409164 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409116 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-socket-dir-parent\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.409486 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7548fad1-54fd-45fb-87f3-3c9b7d8d2573-konnectivity-ca\") pod \"konnectivity-agent-tgvx7\" (UID: \"7548fad1-54fd-45fb-87f3-3c9b7d8d2573\") " pod="kube-system/konnectivity-agent-tgvx7" Apr 20 21:47:12.409486 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-socket-dir-parent\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.409486 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/996f85cb-2c59-49bc-b910-fcea18620d93-iptables-alerter-script\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc" Apr 20 21:47:12.409486 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-var-lib-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.409486 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409234 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysctl-conf\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.409486 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovnkube-config\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.409486 
ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-ovn\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.409486 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a1935ff-0056-494d-bd40-1316c97c620f-tmp-dir\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.409486 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-etc-selinux\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-kubelet-config\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25ce781b-7c4c-499a-bc4a-2efb25261488-serviceca\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-cni-binary-copy\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409589 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-etc-selinux\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-registration-dir\") pod 
\"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409696 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3a1935ff-0056-494d-bd40-1316c97c620f-tmp-dir\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-node-log\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409788 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-kubelet-config\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-registration-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.409885 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409879 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/996f85cb-2c59-49bc-b910-fcea18620d93-iptables-alerter-script\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/088693c1-4b07-48b4-9c28-9cb217da135a-etc-tuned\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409918 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-node-log\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409937 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7548fad1-54fd-45fb-87f3-3c9b7d8d2573-konnectivity-ca\") pod \"konnectivity-agent-tgvx7\" (UID: \"7548fad1-54fd-45fb-87f3-3c9b7d8d2573\") " pod="kube-system/konnectivity-agent-tgvx7" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.409995 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovnkube-config\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/088693c1-4b07-48b4-9c28-9cb217da135a-tmp\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdd9c\" (UniqueName: \"kubernetes.io/projected/1cf66444-7265-4d80-80d8-107f0de4d0db-kube-api-access-fdd9c\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/25ce781b-7c4c-499a-bc4a-2efb25261488-serviceca\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-dbus\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovnkube-script-lib\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410182 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-os-release\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-cni-multus\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410278 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-dbus\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-cni-multus\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.410341 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410315 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-os-release\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.411019 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410400 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-conf-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.411019 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7548fad1-54fd-45fb-87f3-3c9b7d8d2573-agent-certs\") pod \"konnectivity-agent-tgvx7\" (UID: \"7548fad1-54fd-45fb-87f3-3c9b7d8d2573\") " pod="kube-system/konnectivity-agent-tgvx7" Apr 20 21:47:12.411019 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.411019 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-conf-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.411019 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410499 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-kubernetes\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.411019 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-kubernetes\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.411019 ip-10-0-140-110 kubenswrapper[2574]: I0420 
21:47:12.411001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.411019 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.410995 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 21:47:12.411404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-cnibin\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.411404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411058 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwmf\" (UniqueName: \"kubernetes.io/projected/996f85cb-2c59-49bc-b910-fcea18620d93-kube-api-access-rrwmf\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc" Apr 20 21:47:12.411404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.411404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411109 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-system-cni-dir\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.411404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-daemon-config\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.411404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds9mr\" (UniqueName: \"kubernetes.io/projected/25ce781b-7c4c-499a-bc4a-2efb25261488-kube-api-access-ds9mr\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.411404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-etc-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.411739 ip-10-0-140-110 
kubenswrapper[2574]: I0420 21:47:12.411692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovn-node-metrics-cert\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.411739 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411728 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mnn6\" (UniqueName: \"kubernetes.io/projected/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-kube-api-access-9mnn6\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.411844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411755 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-lib-modules\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.411844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.411844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrnvm\" (UniqueName: \"kubernetes.io/projected/3a1935ff-0056-494d-bd40-1316c97c620f-kube-api-access-qrnvm\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.411844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411542 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-etc-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.411844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-cni-bin\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.411844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411300 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-cnibin\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.411844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411832 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") 
" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.411844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-openvswitch\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovnkube-script-lib\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-run-ovn-kubernetes\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411496 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-system-cni-dir\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f2958395-eab5-4338-b6d6-170a01a66c73-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411956 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-daemon-config\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411966 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-lib-modules\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.411974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-cni-bin\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-sys-fs\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-k8s-cni-cncf-io\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-run\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-sys-fs\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-hostroot\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-run\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-multus-certs\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412157 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-k8s-cni-cncf-io\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-hostroot\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-multus-certs\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvmtm\" (UniqueName: \"kubernetes.io/projected/a08eea80-f553-4499-a8dc-94c9591d8221-kube-api-access-xvmtm\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-env-overrides\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412317 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-host\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412340 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-netns\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-host\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412418 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-run-netns\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-log-socket\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-log-socket\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-slash\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412500 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-modprobe-d\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-systemd\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmz7v\" (UniqueName: \"kubernetes.io/projected/088693c1-4b07-48b4-9c28-9cb217da135a-kube-api-access-xmz7v\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-cnibin\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.412977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412605 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-systemd\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-cni-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-os-release\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-env-overrides\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-systemd\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysconfig\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-sys\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-device-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412794 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-os-release\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-cni-bin\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412560 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-slash\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412825 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-etc-kubernetes\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412850 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-cni-netd\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412858 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-multus-cni-dir\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f2958395-eab5-4338-b6d6-170a01a66c73-cnibin\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412873 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412905 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-sys\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412910 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.413758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412913 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a1935ff-0056-494d-bd40-1316c97c620f-hosts-file\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412957 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-etc-kubernetes\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412967 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3a1935ff-0056-494d-bd40-1316c97c620f-hosts-file\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-cni-netd\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.412989 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-modprobe-d\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cf66444-7265-4d80-80d8-107f0de4d0db-cni-binary-copy\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413025 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1cf66444-7265-4d80-80d8-107f0de4d0db-host-var-lib-cni-bin\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c36fc8e4-ee32-4959-9150-79a71f56389f-device-dir\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413074 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25ce781b-7c4c-499a-bc4a-2efb25261488-host\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-run-systemd\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-kubelet\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-systemd-units\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25ce781b-7c4c-499a-bc4a-2efb25261488-host\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413182 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-var-lib-kubelet\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/996f85cb-2c59-49bc-b910-fcea18620d93-host-slash\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.413253 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/996f85cb-2c59-49bc-b910-fcea18620d93-host-slash\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc" Apr 20 21:47:12.414429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-etc-sysconfig\") pod \"tuned-szpsv\" (UID: 
\"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413322 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-host-kubelet\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.413336 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret podName:f57a85f5-bd23-4292-9e22-6f0078a7e4f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:12.913315348 +0000 UTC m=+3.074169492 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret") pod "global-pull-secret-syncer-7524q" (UID: "f57a85f5-bd23-4292-9e22-6f0078a7e4f0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-systemd-units\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/088693c1-4b07-48b4-9c28-9cb217da135a-var-lib-kubelet\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.413510 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1cf66444-7265-4d80-80d8-107f0de4d0db-cni-binary-copy\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.413776 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.413869 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:12.913854701 +0000 UTC m=+3.074708823 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.414942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-ovn-node-metrics-cert\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.415294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.415185 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7548fad1-54fd-45fb-87f3-3c9b7d8d2573-agent-certs\") pod \"konnectivity-agent-tgvx7\" (UID: \"7548fad1-54fd-45fb-87f3-3c9b7d8d2573\") " pod="kube-system/konnectivity-agent-tgvx7" Apr 20 21:47:12.415773 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.415426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/088693c1-4b07-48b4-9c28-9cb217da135a-etc-tuned\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.415773 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.415491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/088693c1-4b07-48b4-9c28-9cb217da135a-tmp\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.417386 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.417350 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:47:12.417505 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.417389 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:47:12.417505 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.417403 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5jc2f for pod openshift-network-diagnostics/network-check-target-hj6pg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:12.417505 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.417464 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f podName:fe4be124-58a1-4591-b319-21b9bcd1aae4 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:12.91744987 +0000 UTC m=+3.078304001 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5jc2f" (UniqueName: "kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f") pod "network-check-target-hj6pg" (UID: "fe4be124-58a1-4591-b319-21b9bcd1aae4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:12.419400 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.419363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt97m\" (UniqueName: \"kubernetes.io/projected/f2958395-eab5-4338-b6d6-170a01a66c73-kube-api-access-pt97m\") pod \"multus-additional-cni-plugins-fxd5b\" (UID: \"f2958395-eab5-4338-b6d6-170a01a66c73\") " pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.419594 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.419568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-785wg\" (UniqueName: \"kubernetes.io/projected/c36fc8e4-ee32-4959-9150-79a71f56389f-kube-api-access-785wg\") pod \"aws-ebs-csi-driver-node-svksj\" (UID: \"c36fc8e4-ee32-4959-9150-79a71f56389f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.423625 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.423590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmz7v\" (UniqueName: \"kubernetes.io/projected/088693c1-4b07-48b4-9c28-9cb217da135a-kube-api-access-xmz7v\") pod \"tuned-szpsv\" (UID: \"088693c1-4b07-48b4-9c28-9cb217da135a\") " pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.423726 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.423642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvmtm\" (UniqueName: \"kubernetes.io/projected/a08eea80-f553-4499-a8dc-94c9591d8221-kube-api-access-xvmtm\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:12.423963 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.423941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwmf\" (UniqueName: \"kubernetes.io/projected/996f85cb-2c59-49bc-b910-fcea18620d93-kube-api-access-rrwmf\") pod \"iptables-alerter-8jqdc\" (UID: \"996f85cb-2c59-49bc-b910-fcea18620d93\") " pod="openshift-network-operator/iptables-alerter-8jqdc" Apr 20 21:47:12.424030 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.423985 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mnn6\" (UniqueName: \"kubernetes.io/projected/a3605f9a-a9e1-40d9-ab62-917e4aca6f0c-kube-api-access-9mnn6\") pod \"ovnkube-node-mlfps\" (UID: \"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c\") " pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.424283 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.424262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdd9c\" (UniqueName: \"kubernetes.io/projected/1cf66444-7265-4d80-80d8-107f0de4d0db-kube-api-access-fdd9c\") pod \"multus-kpm8f\" (UID: \"1cf66444-7265-4d80-80d8-107f0de4d0db\") " pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.424643 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.424621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds9mr\" (UniqueName: 
\"kubernetes.io/projected/25ce781b-7c4c-499a-bc4a-2efb25261488-kube-api-access-ds9mr\") pod \"node-ca-j5w8c\" (UID: \"25ce781b-7c4c-499a-bc4a-2efb25261488\") " pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.425313 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.425280 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrnvm\" (UniqueName: \"kubernetes.io/projected/3a1935ff-0056-494d-bd40-1316c97c620f-kube-api-access-qrnvm\") pod \"node-resolver-6drvt\" (UID: \"3a1935ff-0056-494d-bd40-1316c97c620f\") " pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.431267 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.431224 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal" event={"ID":"dc8fe02abf3a02e07cd912b3d9bc28a2","Type":"ContainerStarted","Data":"32a22a3010dbbe98e3ed3e9dd7cee80ef1fc44e5427b44d12216d1591b6f3c74"} Apr 20 21:47:12.432573 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.432498 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal" event={"ID":"27ef09a0806330265539f634ef8e0e80","Type":"ContainerStarted","Data":"9e5a4b8dc390853137c1f2a02bd9033196b788e055ae95bd45c580b6df1e4353"} Apr 20 21:47:12.592754 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.592721 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6drvt" Apr 20 21:47:12.599518 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.599497 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j5w8c" Apr 20 21:47:12.608378 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.608349 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tgvx7" Apr 20 21:47:12.614985 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.614965 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8jqdc" Apr 20 21:47:12.622570 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.622549 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:12.637208 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.637184 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" Apr 20 21:47:12.644856 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.644834 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-szpsv" Apr 20 21:47:12.651419 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.651401 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" Apr 20 21:47:12.656982 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.656965 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kpm8f" Apr 20 21:47:12.915744 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.915712 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:12.915931 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:12.915771 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:12.915931 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.915810 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:12.915931 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.915883 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:12.915931 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.915884 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:13.915866683 +0000 UTC m=+4.076720823 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:12.915931 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:12.915923 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret podName:f57a85f5-bd23-4292-9e22-6f0078a7e4f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:13.915913026 +0000 UTC m=+4.076767144 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret") pod "global-pull-secret-syncer-7524q" (UID: "f57a85f5-bd23-4292-9e22-6f0078a7e4f0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:13.016320 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.016291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:13.016485 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:13.016461 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:47:13.016485 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:13.016479 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:47:13.016553 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:13.016489 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5jc2f for pod openshift-network-diagnostics/network-check-target-hj6pg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:13.016553 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:13.016534 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f podName:fe4be124-58a1-4591-b319-21b9bcd1aae4 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:14.01652153 +0000 UTC m=+4.177375653 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5jc2f" (UniqueName: "kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f") pod "network-check-target-hj6pg" (UID: "fe4be124-58a1-4591-b319-21b9bcd1aae4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:13.058541 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.058510 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3605f9a_a9e1_40d9_ab62_917e4aca6f0c.slice/crio-9d5e65463e92b926a83c74899101403a09884ea7950d5d0f3a4f1e4fbcf0ff52 WatchSource:0}: Error finding container 9d5e65463e92b926a83c74899101403a09884ea7950d5d0f3a4f1e4fbcf0ff52: Status 404 returned error can't find the container with id 9d5e65463e92b926a83c74899101403a09884ea7950d5d0f3a4f1e4fbcf0ff52 Apr 20 21:47:13.059683 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.059660 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cf66444_7265_4d80_80d8_107f0de4d0db.slice/crio-cfde2bf743226d43b9514666090ad52d65d0793bbe6bd7400921cd3303069ec3 WatchSource:0}: Error finding container cfde2bf743226d43b9514666090ad52d65d0793bbe6bd7400921cd3303069ec3: Status 404 returned error can't find the container with id cfde2bf743226d43b9514666090ad52d65d0793bbe6bd7400921cd3303069ec3 Apr 20 21:47:13.060975 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.060953 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc36fc8e4_ee32_4959_9150_79a71f56389f.slice/crio-71c1c9158213c7c1b7741a834284e25ebfa6fb126091319e80c47202c90b2995 WatchSource:0}: Error finding container 71c1c9158213c7c1b7741a834284e25ebfa6fb126091319e80c47202c90b2995: Status 404 returned error can't find the container with id 71c1c9158213c7c1b7741a834284e25ebfa6fb126091319e80c47202c90b2995 Apr 20 21:47:13.065413 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.065391 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7548fad1_54fd_45fb_87f3_3c9b7d8d2573.slice/crio-6c10eedb9b6a843f9eae82bacb4e2d67f14aff2980a3ba6f913a09104526dcb2 WatchSource:0}: Error finding container 6c10eedb9b6a843f9eae82bacb4e2d67f14aff2980a3ba6f913a09104526dcb2: Status 404 returned error can't find the container with id 6c10eedb9b6a843f9eae82bacb4e2d67f14aff2980a3ba6f913a09104526dcb2 Apr 20 21:47:13.066099 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.066078 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod088693c1_4b07_48b4_9c28_9cb217da135a.slice/crio-58446e9c716daed960c227ed82c4f7860b30cdf07a3771f21442c5e6304658d3 WatchSource:0}: Error finding container 58446e9c716daed960c227ed82c4f7860b30cdf07a3771f21442c5e6304658d3: Status 404 returned error can't find the container with id 58446e9c716daed960c227ed82c4f7860b30cdf07a3771f21442c5e6304658d3 Apr 20 21:47:13.066974 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.066952 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25ce781b_7c4c_499a_bc4a_2efb25261488.slice/crio-7cf8f49609fa41018095b9dcc68b4435f5d2c036768a405750aaed1088b33144 WatchSource:0}: Error finding 
container 7cf8f49609fa41018095b9dcc68b4435f5d2c036768a405750aaed1088b33144: Status 404 returned error can't find the container with id 7cf8f49609fa41018095b9dcc68b4435f5d2c036768a405750aaed1088b33144 Apr 20 21:47:13.068050 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.068029 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996f85cb_2c59_49bc_b910_fcea18620d93.slice/crio-1f22dfc4c0d6de4e938c9c748278581146d15a27e328eff9d2fb554b08807a65 WatchSource:0}: Error finding container 1f22dfc4c0d6de4e938c9c748278581146d15a27e328eff9d2fb554b08807a65: Status 404 returned error can't find the container with id 1f22dfc4c0d6de4e938c9c748278581146d15a27e328eff9d2fb554b08807a65 Apr 20 21:47:13.069342 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.069264 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2958395_eab5_4338_b6d6_170a01a66c73.slice/crio-60abdd7bb5099455c6591500e08f4de3a16d92505c6d85a37e8288e31f857b51 WatchSource:0}: Error finding container 60abdd7bb5099455c6591500e08f4de3a16d92505c6d85a37e8288e31f857b51: Status 404 returned error can't find the container with id 60abdd7bb5099455c6591500e08f4de3a16d92505c6d85a37e8288e31f857b51 Apr 20 21:47:13.071968 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:13.071260 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a1935ff_0056_494d_bd40_1316c97c620f.slice/crio-8daee6dc3b2a285512bc4a3970f1905306d9afb6c652d3947931d8946a4127ae WatchSource:0}: Error finding container 8daee6dc3b2a285512bc4a3970f1905306d9afb6c652d3947931d8946a4127ae: Status 404 returned error can't find the container with id 8daee6dc3b2a285512bc4a3970f1905306d9afb6c652d3947931d8946a4127ae Apr 20 21:47:13.332457 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.332335 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 21:42:11 +0000 UTC" deadline="2027-09-19 07:06:56.602232505 +0000 UTC" Apr 20 21:47:13.332457 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.332384 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12393h19m43.26986615s" Apr 20 21:47:13.441242 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.441206 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j5w8c" event={"ID":"25ce781b-7c4c-499a-bc4a-2efb25261488","Type":"ContainerStarted","Data":"7cf8f49609fa41018095b9dcc68b4435f5d2c036768a405750aaed1088b33144"} Apr 20 21:47:13.444681 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.444608 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tgvx7" event={"ID":"7548fad1-54fd-45fb-87f3-3c9b7d8d2573","Type":"ContainerStarted","Data":"6c10eedb9b6a843f9eae82bacb4e2d67f14aff2980a3ba6f913a09104526dcb2"} Apr 20 21:47:13.446742 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.446683 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerStarted","Data":"9d5e65463e92b926a83c74899101403a09884ea7950d5d0f3a4f1e4fbcf0ff52"} Apr 20 21:47:13.449528 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.449478 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-szpsv" event={"ID":"088693c1-4b07-48b4-9c28-9cb217da135a","Type":"ContainerStarted","Data":"58446e9c716daed960c227ed82c4f7860b30cdf07a3771f21442c5e6304658d3"} Apr 20 21:47:13.460351 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.460323 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" event={"ID":"c36fc8e4-ee32-4959-9150-79a71f56389f","Type":"ContainerStarted","Data":"71c1c9158213c7c1b7741a834284e25ebfa6fb126091319e80c47202c90b2995"} Apr 20 21:47:13.462501 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.462402 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kpm8f" event={"ID":"1cf66444-7265-4d80-80d8-107f0de4d0db","Type":"ContainerStarted","Data":"cfde2bf743226d43b9514666090ad52d65d0793bbe6bd7400921cd3303069ec3"} Apr 20 21:47:13.467815 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.467765 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal" event={"ID":"27ef09a0806330265539f634ef8e0e80","Type":"ContainerStarted","Data":"865c09c143b587cd9b004579aa21778d82d7dbafa4884285a8a47c147564e0da"} Apr 20 21:47:13.481065 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.481017 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6drvt" event={"ID":"3a1935ff-0056-494d-bd40-1316c97c620f","Type":"ContainerStarted","Data":"8daee6dc3b2a285512bc4a3970f1905306d9afb6c652d3947931d8946a4127ae"} Apr 20 21:47:13.488733 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.488680 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-110.ec2.internal" podStartSLOduration=2.48866585 podStartE2EDuration="2.48866585s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:47:13.488576018 +0000 UTC m=+3.649430160" watchObservedRunningTime="2026-04-20 21:47:13.48866585 +0000 UTC m=+3.649519989" Apr 20 21:47:13.495928 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.495793 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" event={"ID":"f2958395-eab5-4338-b6d6-170a01a66c73","Type":"ContainerStarted","Data":"60abdd7bb5099455c6591500e08f4de3a16d92505c6d85a37e8288e31f857b51"} Apr 20 21:47:13.499232 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.499202 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8jqdc" event={"ID":"996f85cb-2c59-49bc-b910-fcea18620d93","Type":"ContainerStarted","Data":"1f22dfc4c0d6de4e938c9c748278581146d15a27e328eff9d2fb554b08807a65"} Apr 20 21:47:13.924006 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.923960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:13.924185 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:13.924028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:13.924185 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:13.924162 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:13.924304 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:13.924225 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret podName:f57a85f5-bd23-4292-9e22-6f0078a7e4f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:15.924207096 +0000 UTC m=+6.085061220 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret") pod "global-pull-secret-syncer-7524q" (UID: "f57a85f5-bd23-4292-9e22-6f0078a7e4f0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:13.924680 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:13.924659 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:13.924774 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:13.924715 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:15.924699771 +0000 UTC m=+6.085553894 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:14.025105 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:14.024470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:14.025105 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:14.024663 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:47:14.025105 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:14.024682 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:47:14.025105 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:14.024695 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5jc2f for pod openshift-network-diagnostics/network-check-target-hj6pg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:14.025105 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:14.024754 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f podName:fe4be124-58a1-4591-b319-21b9bcd1aae4 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:16.024736537 +0000 UTC m=+6.185590662 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5jc2f" (UniqueName: "kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f") pod "network-check-target-hj6pg" (UID: "fe4be124-58a1-4591-b319-21b9bcd1aae4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:14.429276 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:14.429240 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:14.429763 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:14.429397 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:14.429828 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:14.429796 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:14.430650 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:14.429894 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:14.430650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:14.429968 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:14.430650 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:14.430045 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:14.529965 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:14.529267 2574 generic.go:358] "Generic (PLEG): container finished" podID="dc8fe02abf3a02e07cd912b3d9bc28a2" containerID="b321bdf86e795f892b980e553d4382b15af15dbfe77087bfae9c2809568b5093" exitCode=0 Apr 20 21:47:14.529965 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:14.529457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal" event={"ID":"dc8fe02abf3a02e07cd912b3d9bc28a2","Type":"ContainerDied","Data":"b321bdf86e795f892b980e553d4382b15af15dbfe77087bfae9c2809568b5093"} Apr 20 21:47:15.537653 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:15.537571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal" event={"ID":"dc8fe02abf3a02e07cd912b3d9bc28a2","Type":"ContainerStarted","Data":"b6182634f4a5438ddbbaa85ae9ff09866bd71a9ce1d543e609500977f938d1e9"} Apr 20 21:47:15.946933 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:15.946144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:15.946933 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:15.946206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:15.946933 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:15.946364 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:15.946933 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:15.946448 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret podName:f57a85f5-bd23-4292-9e22-6f0078a7e4f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:19.946427357 +0000 UTC m=+10.107281496 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret") pod "global-pull-secret-syncer-7524q" (UID: "f57a85f5-bd23-4292-9e22-6f0078a7e4f0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:15.946933 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:15.946523 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:15.946933 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:15.946558 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:19.946546888 +0000 UTC m=+10.107401007 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:16.047285 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:16.047002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:16.047285 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:16.047183 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:47:16.047285 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:16.047201 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:47:16.047285 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:16.047215 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5jc2f for pod openshift-network-diagnostics/network-check-target-hj6pg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:16.047285 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:16.047271 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f podName:fe4be124-58a1-4591-b319-21b9bcd1aae4 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:20.047253752 +0000 UTC m=+10.208107910 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5jc2f" (UniqueName: "kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f") pod "network-check-target-hj6pg" (UID: "fe4be124-58a1-4591-b319-21b9bcd1aae4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:16.427519 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:16.427489 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:16.427703 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:16.427608 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:16.428096 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:16.428063 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:16.428229 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:16.428160 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:16.430162 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:16.430141 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:16.430284 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:16.430243 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:18.426954 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:18.426920 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:18.427526 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:18.427037 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:18.427526 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:18.427044 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:18.427526 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:18.427081 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:18.427526 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:18.427193 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:18.427526 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:18.427281 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:19.983670 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:19.983631 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:19.984184 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:19.983700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:19.984184 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:19.983869 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:19.984184 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:19.983931 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret podName:f57a85f5-bd23-4292-9e22-6f0078a7e4f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:27.983912547 +0000 UTC m=+18.144766669 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret") pod "global-pull-secret-syncer-7524q" (UID: "f57a85f5-bd23-4292-9e22-6f0078a7e4f0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:19.984184 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:19.984001 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:19.984184 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:19.984037 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:27.984025529 +0000 UTC m=+18.144879661 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:20.084273 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:20.084216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:20.084485 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:20.084409 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:47:20.084485 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:20.084435 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:47:20.084485 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:20.084448 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5jc2f for pod openshift-network-diagnostics/network-check-target-hj6pg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:20.084659 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:20.084514 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f podName:fe4be124-58a1-4591-b319-21b9bcd1aae4 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:28.084495735 +0000 UTC m=+18.245349857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5jc2f" (UniqueName: "kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f") pod "network-check-target-hj6pg" (UID: "fe4be124-58a1-4591-b319-21b9bcd1aae4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:20.428752 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:20.428044 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:20.428752 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:20.428154 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0"
Apr 20 21:47:20.428752 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:20.428522 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:20.428752 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:20.428624 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221"
Apr 20 21:47:20.428752 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:20.428663 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:20.428752 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:20.428723 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4"
Apr 20 21:47:22.427602 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:22.427562 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:22.428066 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:22.427611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:22.428066 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:22.427566 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:22.428066 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:22.427699 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4"
Apr 20 21:47:22.428066 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:22.427802 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221"
Apr 20 21:47:22.428066 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:22.427860 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0"
Apr 20 21:47:24.426868 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:24.426833 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:24.427302 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:24.426847 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:24.427302 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:24.426938 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0"
Apr 20 21:47:24.427302 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:24.427031 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221"
Apr 20 21:47:24.427302 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:24.427079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:24.427302 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:24.427156 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4"
Apr 20 21:47:26.426943 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:26.426903 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:26.427427 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:26.426903 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:26.427427 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:26.427047 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221"
Apr 20 21:47:26.427427 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:26.426903 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:26.427427 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:26.427114 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4"
pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:26.427427 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:26.427196 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:28.048137 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:28.047936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:28.048592 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:28.048162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:28.048592 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.048098 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:28.048592 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.048271 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:28.048592 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.048281 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.048263143 +0000 UTC m=+34.209117270 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 21:47:28.048592 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.048319 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret podName:f57a85f5-bd23-4292-9e22-6f0078a7e4f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.048304664 +0000 UTC m=+34.209158781 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret") pod "global-pull-secret-syncer-7524q" (UID: "f57a85f5-bd23-4292-9e22-6f0078a7e4f0") : object "kube-system"/"original-pull-secret" not registered Apr 20 21:47:28.149522 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:28.149482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:28.149680 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.149639 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 21:47:28.149680 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.149661 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 21:47:28.149680 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.149676 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5jc2f for pod openshift-network-diagnostics/network-check-target-hj6pg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:28.149808 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.149739 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f podName:fe4be124-58a1-4591-b319-21b9bcd1aae4 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.14972271 +0000 UTC m=+34.310576846 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5jc2f" (UniqueName: "kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f") pod "network-check-target-hj6pg" (UID: "fe4be124-58a1-4591-b319-21b9bcd1aae4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 21:47:28.426949 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:28.426912 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:28.427124 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:28.426912 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:28.427124 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.427049 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:28.427124 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:28.426912 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:28.427277 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.427130 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:28.427277 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:28.427183 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:30.428097 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.427888 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:30.428763 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:30.428139 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:30.428763 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.427984 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:30.428763 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:30.428236 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:30.428763 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.427954 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:30.428763 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:30.428314 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:30.561398 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.561356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6drvt" event={"ID":"3a1935ff-0056-494d-bd40-1316c97c620f","Type":"ContainerStarted","Data":"d2830f7ad352e04758a206e00ea36ce1ee0e1960a6078dd95cc2f89d8c28dd37"} Apr 20 21:47:30.562722 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.562694 2574 generic.go:358] "Generic (PLEG): container finished" podID="f2958395-eab5-4338-b6d6-170a01a66c73" containerID="72c1c8f9f0c5b765ce021a24053be94733707e661f51ad0d82846a46aa7fd497" exitCode=0 Apr 20 21:47:30.562823 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.562779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" event={"ID":"f2958395-eab5-4338-b6d6-170a01a66c73","Type":"ContainerDied","Data":"72c1c8f9f0c5b765ce021a24053be94733707e661f51ad0d82846a46aa7fd497"} Apr 20 21:47:30.564056 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.564036 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j5w8c" event={"ID":"25ce781b-7c4c-499a-bc4a-2efb25261488","Type":"ContainerStarted","Data":"29d6bb0ddf7b514d1caddbd51b8d5831e5140b32b7c8f136cacc9a667d2b8e25"} Apr 20 21:47:30.565391 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.565336 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tgvx7" event={"ID":"7548fad1-54fd-45fb-87f3-3c9b7d8d2573","Type":"ContainerStarted","Data":"db798109b0a8d4a10486a06feda3d9ed9560e4a15b69ab2c1ce0bb490366a53e"} Apr 20 21:47:30.567143 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.567127 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 21:47:30.567445 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.567425 2574 generic.go:358] "Generic (PLEG): container finished" podID="a3605f9a-a9e1-40d9-ab62-917e4aca6f0c" containerID="804049d2c6ec61d8bc21663cbbff2981f35e3719117bec7cef1fa3819cb4de55" exitCode=1 Apr 20 21:47:30.567546 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.567449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerStarted","Data":"cf4f6972c53b09a9f3701261ba65e42b5a3df9d3288840f7269592a50de2a3b0"} Apr 20 21:47:30.567546 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.567487 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerStarted","Data":"dbd4b6879caa9bb60ca6a5fffd24266ead2406457506fdad140045d59004b82a"} Apr 20 21:47:30.567546 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.567501 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerDied","Data":"804049d2c6ec61d8bc21663cbbff2981f35e3719117bec7cef1fa3819cb4de55"} Apr 20 21:47:30.567546 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.567516 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" 
event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerStarted","Data":"c438af04344a8b2b2739c09e5a0c1cfcd50fa5d88ad75330eda3fba0fbd229df"} Apr 20 21:47:30.568734 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.568714 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-szpsv" event={"ID":"088693c1-4b07-48b4-9c28-9cb217da135a","Type":"ContainerStarted","Data":"380814e310224c0cd24034e609dfe548e3ee92583139963ca75752f798ac599a"} Apr 20 21:47:30.569993 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.569971 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" event={"ID":"c36fc8e4-ee32-4959-9150-79a71f56389f","Type":"ContainerStarted","Data":"bdf9fe69f91218b593803703183883370012ee030eedf090d805326eaa9a6583"} Apr 20 21:47:30.571079 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.571063 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kpm8f" event={"ID":"1cf66444-7265-4d80-80d8-107f0de4d0db","Type":"ContainerStarted","Data":"f8e07933fd86c4b546e09493ec72bf352caac961a49e130093a41a3ed425c612"} Apr 20 21:47:30.574649 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.572783 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6drvt" podStartSLOduration=4.007452901 podStartE2EDuration="20.572769035s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.073586658 +0000 UTC m=+3.234440792" lastFinishedPulling="2026-04-20 21:47:29.638902797 +0000 UTC m=+19.799756926" observedRunningTime="2026-04-20 21:47:30.572430969 +0000 UTC m=+20.733285093" watchObservedRunningTime="2026-04-20 21:47:30.572769035 +0000 UTC m=+20.733623177" Apr 20 21:47:30.574649 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.572971 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-110.ec2.internal" podStartSLOduration=19.572964744 podStartE2EDuration="19.572964744s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:47:15.54982975 +0000 UTC m=+5.710683892" watchObservedRunningTime="2026-04-20 21:47:30.572964744 +0000 UTC m=+20.733818885" Apr 20 21:47:30.583776 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.583741 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tgvx7" podStartSLOduration=4.012138557 podStartE2EDuration="20.583729624s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.067318947 +0000 UTC m=+3.228173073" lastFinishedPulling="2026-04-20 21:47:29.638910019 +0000 UTC m=+19.799764140" observedRunningTime="2026-04-20 21:47:30.583365197 +0000 UTC m=+20.744219337" watchObservedRunningTime="2026-04-20 21:47:30.583729624 +0000 UTC m=+20.744583763" Apr 20 21:47:30.607301 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.607263 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j5w8c" podStartSLOduration=4.038182777 podStartE2EDuration="20.60724965s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.069405451 +0000 UTC m=+3.230259579" lastFinishedPulling="2026-04-20 21:47:29.638472327 +0000 UTC m=+19.799326452" 
observedRunningTime="2026-04-20 21:47:30.606918106 +0000 UTC m=+20.767772245" watchObservedRunningTime="2026-04-20 21:47:30.60724965 +0000 UTC m=+20.768103806" Apr 20 21:47:30.618571 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.618527 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-szpsv" podStartSLOduration=4.048131544 podStartE2EDuration="20.618515573s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.068087101 +0000 UTC m=+3.228941228" lastFinishedPulling="2026-04-20 21:47:29.638471136 +0000 UTC m=+19.799325257" observedRunningTime="2026-04-20 21:47:30.618255724 +0000 UTC m=+20.779109905" watchObservedRunningTime="2026-04-20 21:47:30.618515573 +0000 UTC m=+20.779369753" Apr 20 21:47:30.633066 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.633032 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kpm8f" podStartSLOduration=4.020793119 podStartE2EDuration="20.633021155s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.061550307 +0000 UTC m=+3.222404431" lastFinishedPulling="2026-04-20 21:47:29.673778346 +0000 UTC m=+19.834632467" observedRunningTime="2026-04-20 21:47:30.632860451 +0000 UTC m=+20.793714590" watchObservedRunningTime="2026-04-20 21:47:30.633021155 +0000 UTC m=+20.793875294" Apr 20 21:47:30.735293 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.735261 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tgvx7" Apr 20 21:47:30.735865 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:30.735846 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tgvx7" Apr 20 21:47:31.051254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.051229 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 21:47:31.371671 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.371559 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T21:47:31.051249636Z","UUID":"8a852975-f17a-4ff2-8a47-df5a66d09aea","Handler":null,"Name":"","Endpoint":""} Apr 20 21:47:31.373980 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.373956 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 21:47:31.373980 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.373981 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 21:47:31.574642 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.574605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8jqdc" event={"ID":"996f85cb-2c59-49bc-b910-fcea18620d93","Type":"ContainerStarted","Data":"9be8c748ec5953515989d85f6490acd1328902ff3cbfb46aa9bc8391fe0a5a49"} Apr 20 21:47:31.577817 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.577785 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 
Apr 20 21:47:31.578355 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.578247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerStarted","Data":"a854ecdd1876eec97c5c8df84e845c40c8c27821d432b77165913df873c06a17"}
Apr 20 21:47:31.580318 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.580173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" event={"ID":"c36fc8e4-ee32-4959-9150-79a71f56389f","Type":"ContainerStarted","Data":"98bccee49e89bd494f866b97a017b330f544ebacffa702cec08234ee2b669fd4"}
Apr 20 21:47:31.587201 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.587155 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8jqdc" podStartSLOduration=5.021890868 podStartE2EDuration="21.587140196s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.073200523 +0000 UTC m=+3.234054653" lastFinishedPulling="2026-04-20 21:47:29.638449846 +0000 UTC m=+19.799303981" observedRunningTime="2026-04-20 21:47:31.586515369 +0000 UTC m=+21.747369510" watchObservedRunningTime="2026-04-20 21:47:31.587140196 +0000 UTC m=+21.747994339"
Apr 20 21:47:31.974079 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.974044 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tgvx7"
Apr 20 21:47:31.974801 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:31.974694 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tgvx7"
Apr 20 21:47:32.426789 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:32.426751 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:32.426789 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:32.426771 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:32.427040 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:32.426751 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:32.427040 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:32.426868 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0"
Apr 20 21:47:32.427040 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:32.426955 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221"
Apr 20 21:47:32.427170 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:32.427054 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4"
Apr 20 21:47:32.584549 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:32.584509 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" event={"ID":"c36fc8e4-ee32-4959-9150-79a71f56389f","Type":"ContainerStarted","Data":"e32b048d286c8269c6a5f46e4277ca8494adf580b90807457b3bb00066ff1286"}
Apr 20 21:47:32.597939 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:32.597888 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-svksj" podStartSLOduration=3.661940869 podStartE2EDuration="22.597873876s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.063606238 +0000 UTC m=+3.224460369" lastFinishedPulling="2026-04-20 21:47:31.999539255 +0000 UTC m=+22.160393376" observedRunningTime="2026-04-20 21:47:32.597542759 +0000 UTC m=+22.758396899" watchObservedRunningTime="2026-04-20 21:47:32.597873876 +0000 UTC m=+22.758728016"
Apr 20 21:47:33.589179 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:33.589151 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log"
Apr 20 21:47:33.589764 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:33.589600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerStarted","Data":"e2036aa0b36a9baa783d9c73cf616fc55e2a5c7b089fdc5d45475d568983fdbc"}
Apr 20 21:47:34.426824 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:34.426797 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:34.426998 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:34.426797 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:34.426998 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:34.426920 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0"
Apr 20 21:47:34.427117 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:34.427020 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:34.427117 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:34.427014 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4"
Apr 20 21:47:34.427206 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:34.427134 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221"
Apr 20 21:47:35.594463 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:35.594191 2574 generic.go:358] "Generic (PLEG): container finished" podID="f2958395-eab5-4338-b6d6-170a01a66c73" containerID="b8e5228939533f603a0c682b52de108c55cd94efdf0b0d6997d06cac724e2666" exitCode=0
Apr 20 21:47:35.595084 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:35.594275 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" event={"ID":"f2958395-eab5-4338-b6d6-170a01a66c73","Type":"ContainerDied","Data":"b8e5228939533f603a0c682b52de108c55cd94efdf0b0d6997d06cac724e2666"}
Apr 20 21:47:35.597421 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:35.597403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log"
Apr 20 21:47:35.597738 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:35.597717 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerStarted","Data":"7df9b59cd4af1bde670d1d2f20999163d444813d397a9653311451c264b0003d"}
Apr 20 21:47:35.598047 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:35.598014 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:35.598047 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:35.598042 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:35.598199 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:35.598134 2574 scope.go:117] "RemoveContainer" containerID="804049d2c6ec61d8bc21663cbbff2981f35e3719117bec7cef1fa3819cb4de55"
Apr 20 21:47:35.613052 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:35.613032 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps"
Apr 20 21:47:36.427546 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.427475 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:36.427662 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.427475 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:36.427662 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:36.427570 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:36.427662 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.427474 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:36.427662 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:36.427649 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:36.427787 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:36.427755 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:36.602765 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.602741 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 21:47:36.603169 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.603142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" event={"ID":"a3605f9a-a9e1-40d9-ab62-917e4aca6f0c","Type":"ContainerStarted","Data":"4c06f44d7a9dfbfe20c339d3d03ba201267a2d3cecadacafe5d268974404011c"} Apr 20 21:47:36.603465 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.603388 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:36.605222 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.605196 2574 generic.go:358] "Generic (PLEG): container finished" podID="f2958395-eab5-4338-b6d6-170a01a66c73" containerID="ba0132a07fcf1ec9b8561cf2854d5d23b0eeb2267815dce704446e6d910b0941" exitCode=0 Apr 20 21:47:36.605331 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.605235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" event={"ID":"f2958395-eab5-4338-b6d6-170a01a66c73","Type":"ContainerDied","Data":"ba0132a07fcf1ec9b8561cf2854d5d23b0eeb2267815dce704446e6d910b0941"} Apr 20 21:47:36.621616 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.621587 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:47:36.629985 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:36.629944 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" podStartSLOduration=9.60332735 podStartE2EDuration="26.629931209s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.060338704 +0000 UTC m=+3.221192830" lastFinishedPulling="2026-04-20 21:47:30.086942555 +0000 UTC m=+20.247796689" observedRunningTime="2026-04-20 21:47:36.628584604 +0000 UTC m=+26.789438746" watchObservedRunningTime="2026-04-20 21:47:36.629931209 +0000 UTC m=+26.790785352" Apr 20 21:47:37.073049 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:37.073015 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hj6pg"] Apr 20 21:47:37.073209 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:37.073158 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:37.073303 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:37.073261 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:37.076830 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:37.076652 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7524q"] Apr 20 21:47:37.076830 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:37.076816 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:37.076999 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:37.076941 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:37.077404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:37.077364 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xmrt9"] Apr 20 21:47:37.077514 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:37.077500 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:37.077629 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:37.077608 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:37.609397 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:37.609350 2574 generic.go:358] "Generic (PLEG): container finished" podID="f2958395-eab5-4338-b6d6-170a01a66c73" containerID="c99118dd2cd73d0ecdcba6d17bb018f35f4dd40133fc4ff80d56b03f6a18952c" exitCode=0 Apr 20 21:47:37.609759 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:37.609410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" event={"ID":"f2958395-eab5-4338-b6d6-170a01a66c73","Type":"ContainerDied","Data":"c99118dd2cd73d0ecdcba6d17bb018f35f4dd40133fc4ff80d56b03f6a18952c"} Apr 20 21:47:38.426710 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:38.426676 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:38.426892 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:38.426820 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:39.427144 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:39.427110 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:39.427144 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:39.427130 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:39.427806 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:39.427234 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:39.427806 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:39.427337 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:40.429762 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:40.429729 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:40.430269 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:40.429850 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:41.427361 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:41.427332 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:41.427511 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:41.427473 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:41.427511 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:41.427490 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:41.427639 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:41.427613 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:42.427383 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:42.427135 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:47:42.427843 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:42.427513 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:47:43.426919 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.426900 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:47:43.427052 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.426903 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:47:43.427052 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.426993 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7524q" podUID="f57a85f5-bd23-4292-9e22-6f0078a7e4f0" Apr 20 21:47:43.427117 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.427082 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hj6pg" podUID="fe4be124-58a1-4591-b319-21b9bcd1aae4" Apr 20 21:47:43.620578 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.620548 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-110.ec2.internal" event="NodeReady" Apr 20 21:47:43.621021 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.620683 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 21:47:43.622799 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.622774 2574 generic.go:358] "Generic (PLEG): container finished" podID="f2958395-eab5-4338-b6d6-170a01a66c73" containerID="898053f4cf25e32dfe243f545086e9c277d100ff448e78fbdc34bb8dd0fa587e" exitCode=0 Apr 20 21:47:43.622914 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.622805 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" event={"ID":"f2958395-eab5-4338-b6d6-170a01a66c73","Type":"ContainerDied","Data":"898053f4cf25e32dfe243f545086e9c277d100ff448e78fbdc34bb8dd0fa587e"} Apr 20 21:47:43.657541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.657514 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-589fd86f94-t58lj"] Apr 20 21:47:43.661623 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.661602 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.663745 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.663616 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 21:47:43.663901 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.663852 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 21:47:43.663977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.663934 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rxx7z\"" Apr 20 21:47:43.664092 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.664075 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 21:47:43.668610 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.668339 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w"] Apr 20 21:47:43.672757 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.672225 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 21:47:43.675119 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.675028 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"] Apr 20 21:47:43.675270 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.675212 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" Apr 20 21:47:43.678238 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.678214 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 21:47:43.678462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.678440 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 21:47:43.678462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.678458 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 21:47:43.678612 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.678563 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-b8zzs\"" Apr 20 21:47:43.678724 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.678709 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 21:47:43.681392 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.681302 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm"] Apr 20 21:47:43.685794 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.684357 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vb97h"] Apr 20 21:47:43.685794 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.685247 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:47:43.688509 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.688489 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 21:47:43.691319 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.690487 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-589fd86f94-t58lj"] Apr 20 21:47:43.691319 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.690516 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gm5l7"] Apr 20 21:47:43.691319 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.690598 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.691319 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.690723 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.692431 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.692407 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 21:47:43.692529 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.692462 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 21:47:43.692529 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.692489 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 21:47:43.692783 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.692766 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 21:47:43.692899 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.692884 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-s4j5s\"" Apr 20 21:47:43.692972 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.692953 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 21:47:43.693075 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.693058 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 21:47:43.694291 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.693994 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w"] Apr 20 21:47:43.694291 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.694022 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm"] Apr 20 21:47:43.694291 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.694040 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"] Apr 20 21:47:43.694291 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.694051 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gm5l7"] Apr 20 21:47:43.694291 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.694146 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:47:43.695348 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.695244 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vb97h"] Apr 20 21:47:43.695906 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.695886 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 21:47:43.696014 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.695996 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 21:47:43.696300 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.696280 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bqmdj\"" Apr 20 21:47:43.696490 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.696316 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 21:47:43.760294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-ca\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.760416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-hub\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.760416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760335 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-ca-trust-extracted\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.760416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxx5r\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-kube-api-access-pxx5r\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.760523 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27mt8\" (UniqueName: \"kubernetes.io/projected/2294a10e-a1dc-4759-a425-e047c7157139-kube-api-access-27mt8\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:47:43.760523 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760446 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8e4c4164-f43b-4dfe-b639-705ced10d164-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w\" (UID: \"8e4c4164-f43b-4dfe-b639-705ced10d164\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" Apr 20 21:47:43.760523 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.760523 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2294a10e-a1dc-4759-a425-e047c7157139-tmp\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:47:43.760645 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2294a10e-a1dc-4759-a425-e047c7157139-klusterlet-config\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:47:43.760645 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-installation-pull-secrets\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.760645 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pqkl\" (UniqueName: \"kubernetes.io/projected/c31613c1-5256-4da7-9941-7c734fc3dce2-kube-api-access-2pqkl\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.760645 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760600 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-certificates\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.760645 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760618 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgtf8\" (UniqueName: \"kubernetes.io/projected/8e4c4164-f43b-4dfe-b639-705ced10d164-kube-api-access-hgtf8\") pod 
\"managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w\" (UID: \"8e4c4164-f43b-4dfe-b639-705ced10d164\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" Apr 20 21:47:43.760785 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c31613c1-5256-4da7-9941-7c734fc3dce2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.760785 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760676 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.760785 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760777 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-image-registry-private-configuration\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.760870 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760798 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.760870 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-trusted-ca\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.760870 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.760828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-bound-sa-token\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.861614 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861582 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f2509d8-512e-4191-a295-3e79802650ac-config-volume\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.861740 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861622 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:47:43.861740 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27mt8\" (UniqueName: \"kubernetes.io/projected/2294a10e-a1dc-4759-a425-e047c7157139-kube-api-access-27mt8\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:47:43.861740 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8e4c4164-f43b-4dfe-b639-705ced10d164-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w\" (UID: \"8e4c4164-f43b-4dfe-b639-705ced10d164\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" Apr 20 21:47:43.861740 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmkgd\" (UniqueName: \"kubernetes.io/projected/2f2509d8-512e-4191-a295-3e79802650ac-kube-api-access-rmkgd\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.861934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.861934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2294a10e-a1dc-4759-a425-e047c7157139-tmp\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:47:43.861934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24gwm\" (UniqueName: \"kubernetes.io/projected/bf0d97d7-a0a1-4f99-802a-39ac411ff714-kube-api-access-24gwm\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:47:43.861934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2294a10e-a1dc-4759-a425-e047c7157139-klusterlet-config\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:47:43.861934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861914 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-installation-pull-secrets\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.862140 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pqkl\" (UniqueName: \"kubernetes.io/projected/c31613c1-5256-4da7-9941-7c734fc3dce2-kube-api-access-2pqkl\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.862140 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.861981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-certificates\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.862140 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgtf8\" (UniqueName: \"kubernetes.io/projected/8e4c4164-f43b-4dfe-b639-705ced10d164-kube-api-access-hgtf8\") pod \"managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w\" (UID: \"8e4c4164-f43b-4dfe-b639-705ced10d164\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" Apr 20 21:47:43.862140 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c31613c1-5256-4da7-9941-7c734fc3dce2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.862140 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.862140 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862089 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.862140 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2294a10e-a1dc-4759-a425-e047c7157139-tmp\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 
21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-image-registry-private-configuration\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-trusted-ca\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-bound-sa-token\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.862277 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862286 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-ca\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.862294 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589fd86f94-t58lj: secret "image-registry-tls" not found Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-hub\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-ca-trust-extracted\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:43.862462 
Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.862358 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls podName:e7dbf0fe-03e6-46ca-88a7-16abae2daac1 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.362339167 +0000 UTC m=+34.523193288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls") pod "image-registry-589fd86f94-t58lj" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1") : secret "image-registry-tls" not found
Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxx5r\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-kube-api-access-pxx5r\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:43.862462 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f2509d8-512e-4191-a295-3e79802650ac-tmp-dir\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h"
Apr 20 21:47:43.863038 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.862649 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-ca-trust-extracted\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:43.863748 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.863536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-trusted-ca\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:43.863877 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.863779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-certificates\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:43.863961 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.863941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c31613c1-5256-4da7-9941-7c734fc3dce2-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"
Apr 20 21:47:43.866769 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.866744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-ca\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"
Apr 20 21:47:43.866892 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.866761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"
Apr 20 21:47:43.866892 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.866785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-image-registry-private-configuration\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:43.866892 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.866834 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-installation-pull-secrets\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:43.867045 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.866931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-hub\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"
Apr 20 21:47:43.867102 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.867085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8e4c4164-f43b-4dfe-b639-705ced10d164-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w\" (UID: \"8e4c4164-f43b-4dfe-b639-705ced10d164\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w"
Apr 20 21:47:43.867323 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.867299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c31613c1-5256-4da7-9941-7c734fc3dce2-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"
Apr 20 21:47:43.868041 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.868024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2294a10e-a1dc-4759-a425-e047c7157139-klusterlet-config\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm"
Apr 20 21:47:43.872696 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.872675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pqkl\" (UniqueName: \"kubernetes.io/projected/c31613c1-5256-4da7-9941-7c734fc3dce2-kube-api-access-2pqkl\") pod \"cluster-proxy-proxy-agent-799c57c576-2mgtk\" (UID: \"c31613c1-5256-4da7-9941-7c734fc3dce2\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"
Apr 20 21:47:43.873414 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.873227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxx5r\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-kube-api-access-pxx5r\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:43.874049 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.873777 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27mt8\" (UniqueName: \"kubernetes.io/projected/2294a10e-a1dc-4759-a425-e047c7157139-kube-api-access-27mt8\") pod \"klusterlet-addon-workmgr-758cb8bfc4-ngvgm\" (UID: \"2294a10e-a1dc-4759-a425-e047c7157139\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm"
Apr 20 21:47:43.876738 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.874682 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgtf8\" (UniqueName: \"kubernetes.io/projected/8e4c4164-f43b-4dfe-b639-705ced10d164-kube-api-access-hgtf8\") pod \"managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w\" (UID: \"8e4c4164-f43b-4dfe-b639-705ced10d164\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w"
Apr 20 21:47:43.877497 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.877478 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-bound-sa-token\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:43.963292 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.963207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f2509d8-512e-4191-a295-3e79802650ac-config-volume\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h"
Apr 20 21:47:43.963292 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.963254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7"
Apr 20 21:47:43.963292 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.963283 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmkgd\" (UniqueName: \"kubernetes.io/projected/2f2509d8-512e-4191-a295-3e79802650ac-kube-api-access-rmkgd\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h"
pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:47:43.963513 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.963417 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:47:43.963513 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.963494 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert podName:bf0d97d7-a0a1-4f99-802a-39ac411ff714 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.463473883 +0000 UTC m=+34.624328005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert") pod "ingress-canary-gm5l7" (UID: "bf0d97d7-a0a1-4f99-802a-39ac411ff714") : secret "canary-serving-cert" not found Apr 20 21:47:43.963626 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.963514 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.963626 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.963596 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f2509d8-512e-4191-a295-3e79802650ac-tmp-dir\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.963692 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.963677 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:47:43.963764 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:43.963750 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls podName:2f2509d8-512e-4191-a295-3e79802650ac nodeName:}" failed. No retries permitted until 2026-04-20 21:47:44.463732055 +0000 UTC m=+34.624586176 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls") pod "dns-default-vb97h" (UID: "2f2509d8-512e-4191-a295-3e79802650ac") : secret "dns-default-metrics-tls" not found Apr 20 21:47:43.963828 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.963803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f2509d8-512e-4191-a295-3e79802650ac-config-volume\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.963919 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.963903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f2509d8-512e-4191-a295-3e79802650ac-tmp-dir\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.972290 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.972272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmkgd\" (UniqueName: \"kubernetes.io/projected/2f2509d8-512e-4191-a295-3e79802650ac-kube-api-access-rmkgd\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:43.972493 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.972475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24gwm\" (UniqueName: \"kubernetes.io/projected/bf0d97d7-a0a1-4f99-802a-39ac411ff714-kube-api-access-24gwm\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:47:43.996509 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:43.996494 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" Apr 20 21:47:44.006191 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.006166 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:47:44.013912 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.013891 2574 util.go:30] "No sandbox for pod can be found. 
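Note the retry schedule in the nestedpendingoperations.go failures: this excerpt shows durationBeforeRetry growing as 500ms, then 1s, 2s, 4s, and 8s for the same volumes, with 32s appearing for a different class of failure further down. That progression is consistent with a doubling backoff. A minimal sketch of such a policy follows; the initial delay and cap are read off this log, not taken from kubelet's source, so treat them as assumptions.

// Illustrative sketch of the doubling retry delay visible in these entries.
// initialDelay and maxDelay are assumptions inferred from the log excerpt.
package main

import (
	"fmt"
	"time"
)

const (
	initialDelay = 500 * time.Millisecond // first "durationBeforeRetry" logged
	maxDelay     = 32 * time.Second       // largest delay observed in this excerpt
)

// nextDelay doubles the previous delay, clamped to maxDelay.
func nextDelay(prev time.Duration) time.Duration {
	if prev == 0 {
		return initialDelay
	}
	next := prev * 2
	if next > maxDelay {
		next = maxDelay
	}
	return next
}

func main() {
	// Prints: 500ms 1s 2s 4s 8s 16s 32s 32s. The excerpt itself only shows
	// 500ms through 8s for these volumes, so the 16s step is an assumption.
	var d time.Duration
	for i := 0; i < 8; i++ {
		d = nextDelay(d)
		fmt.Print(d, " ")
	}
	fmt.Println()
}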
Apr 20 21:47:44.013912 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.013891 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"
Apr 20 21:47:44.064831 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.064010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:44.064831 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.064072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:44.064831 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.064257 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:44.064831 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.064321 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret podName:f57a85f5-bd23-4292-9e22-6f0078a7e4f0 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:16.06430262 +0000 UTC m=+66.225156737 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret") pod "global-pull-secret-syncer-7524q" (UID: "f57a85f5-bd23-4292-9e22-6f0078a7e4f0") : object "kube-system"/"original-pull-secret" not registered
Apr 20 21:47:44.064831 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.064415 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:44.064831 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.064452 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:16.064440252 +0000 UTC m=+66.225294385 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 21:47:44.164326 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.164285 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm"]
Apr 20 21:47:44.164794 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.164766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:44.165022 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.164997 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 21:47:44.165022 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.165021 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 21:47:44.165866 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.165035 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5jc2f for pod openshift-network-diagnostics/network-check-target-hj6pg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:44.165866 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.165101 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f podName:fe4be124-58a1-4591-b319-21b9bcd1aae4 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:16.165079609 +0000 UTC m=+66.325933731 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5jc2f" (UniqueName: "kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f") pod "network-check-target-hj6pg" (UID: "fe4be124-58a1-4591-b319-21b9bcd1aae4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 21:47:44.165866 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.165265 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w"]
Apr 20 21:47:44.168948 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:44.168919 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e4c4164_f43b_4dfe_b639_705ced10d164.slice/crio-4e2a70eae74de784a3af961993b575780a0cde3f46c859356c274e22abec461e WatchSource:0}: Error finding container 4e2a70eae74de784a3af961993b575780a0cde3f46c859356c274e22abec461e: Status 404 returned error can't find the container with id 4e2a70eae74de784a3af961993b575780a0cde3f46c859356c274e22abec461e
Apr 20 21:47:44.169099 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:44.169084 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2294a10e_a1dc_4759_a425_e047c7157139.slice/crio-358f86841c50aacf98a93438b8a2407df84e9e67239ca6780d12599a34e6329e WatchSource:0}: Error finding container 358f86841c50aacf98a93438b8a2407df84e9e67239ca6780d12599a34e6329e: Status 404 returned error can't find the container with id 358f86841c50aacf98a93438b8a2407df84e9e67239ca6780d12599a34e6329e
Apr 20 21:47:44.183212 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.183188 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk"]
Apr 20 21:47:44.186440 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:47:44.186421 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc31613c1_5256_4da7_9941_7c734fc3dce2.slice/crio-e2788b6414b35c0b1c05b4988674d76cc176517e00fd0fa30eb794591a10f531 WatchSource:0}: Error finding container e2788b6414b35c0b1c05b4988674d76cc176517e00fd0fa30eb794591a10f531: Status 404 returned error can't find the container with id e2788b6414b35c0b1c05b4988674d76cc176517e00fd0fa30eb794591a10f531
Apr 20 21:47:44.366004 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.365974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:44.366150 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.366113 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:47:44.366150 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.366131 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589fd86f94-t58lj: secret "image-registry-tls" not found
Apr 20 21:47:44.366233 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.366184 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls podName:e7dbf0fe-03e6-46ca-88a7-16abae2daac1 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:45.366168686 +0000 UTC m=+35.527022804 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls") pod "image-registry-589fd86f94-t58lj" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1") : secret "image-registry-tls" not found
Apr 20 21:47:44.427328 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.427302 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9"
Apr 20 21:47:44.429461 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.429438 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 21:47:44.429601 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.429444 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5wzfs\""
Apr 20 21:47:44.467144 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.467123 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7"
Apr 20 21:47:44.467240 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.467179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h"
Apr 20 21:47:44.467287 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.467260 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:47:44.467287 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.467262 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:47:44.467350 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.467304 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls podName:2f2509d8-512e-4191-a295-3e79802650ac nodeName:}" failed. No retries permitted until 2026-04-20 21:47:45.467291383 +0000 UTC m=+35.628145505 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls") pod "dns-default-vb97h" (UID: "2f2509d8-512e-4191-a295-3e79802650ac") : secret "dns-default-metrics-tls" not found
Apr 20 21:47:44.467350 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:44.467316 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert podName:bf0d97d7-a0a1-4f99-802a-39ac411ff714 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:45.467310111 +0000 UTC m=+35.628164229 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert") pod "ingress-canary-gm5l7" (UID: "bf0d97d7-a0a1-4f99-802a-39ac411ff714") : secret "canary-serving-cert" not found
Apr 20 21:47:44.628025 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.627940 2574 generic.go:358] "Generic (PLEG): container finished" podID="f2958395-eab5-4338-b6d6-170a01a66c73" containerID="c89f17ad629d84129994b6f3e1effe3b83945808c502408203833abef755cb2d" exitCode=0
Apr 20 21:47:44.628453 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.628017 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" event={"ID":"f2958395-eab5-4338-b6d6-170a01a66c73","Type":"ContainerDied","Data":"c89f17ad629d84129994b6f3e1effe3b83945808c502408203833abef755cb2d"}
Apr 20 21:47:44.629062 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.628961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" event={"ID":"c31613c1-5256-4da7-9941-7c734fc3dce2","Type":"ContainerStarted","Data":"e2788b6414b35c0b1c05b4988674d76cc176517e00fd0fa30eb794591a10f531"}
Apr 20 21:47:44.630016 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.629962 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" event={"ID":"2294a10e-a1dc-4759-a425-e047c7157139","Type":"ContainerStarted","Data":"358f86841c50aacf98a93438b8a2407df84e9e67239ca6780d12599a34e6329e"}
Apr 20 21:47:44.630908 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:44.630890 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" event={"ID":"8e4c4164-f43b-4dfe-b639-705ced10d164","Type":"ContainerStarted","Data":"4e2a70eae74de784a3af961993b575780a0cde3f46c859356c274e22abec461e"}
Apr 20 21:47:45.375113 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.374867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:45.375347 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:45.375284 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:47:45.375347 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:45.375301 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589fd86f94-t58lj: secret "image-registry-tls" not found
Apr 20 21:47:45.375568 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:45.375352 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls podName:e7dbf0fe-03e6-46ca-88a7-16abae2daac1 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:47.375339561 +0000 UTC m=+37.536193679 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls") pod "image-registry-589fd86f94-t58lj" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1") : secret "image-registry-tls" not found
Apr 20 21:47:45.428788 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.427631 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg"
Apr 20 21:47:45.428788 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.428107 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q"
Apr 20 21:47:45.432201 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.431916 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 21:47:45.432201 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.432015 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 21:47:45.432201 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.431916 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 21:47:45.432488 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.432274 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hbjh6\""
Apr 20 21:47:45.476353 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.476318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7"
Apr 20 21:47:45.476522 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.476455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h"
Apr 20 21:47:45.477145 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:45.476627 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:47:45.477145 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:45.476689 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls podName:2f2509d8-512e-4191-a295-3e79802650ac nodeName:}" failed. No retries permitted until 2026-04-20 21:47:47.476671191 +0000 UTC m=+37.637525314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls") pod "dns-default-vb97h" (UID: "2f2509d8-512e-4191-a295-3e79802650ac") : secret "dns-default-metrics-tls" not found
Apr 20 21:47:45.477145 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:45.477064 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:47:45.477145 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:45.477108 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert podName:bf0d97d7-a0a1-4f99-802a-39ac411ff714 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:47.477092908 +0000 UTC m=+37.637947031 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert") pod "ingress-canary-gm5l7" (UID: "bf0d97d7-a0a1-4f99-802a-39ac411ff714") : secret "canary-serving-cert" not found
Apr 20 21:47:45.638969 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:45.638886 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" event={"ID":"f2958395-eab5-4338-b6d6-170a01a66c73","Type":"ContainerStarted","Data":"fd1e3c6ffc433fea0aa6a8c6a80db5c9853d747696b9afa655bd6e16e79a4253"}
Apr 20 21:47:47.393507 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:47.393468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj"
Apr 20 21:47:47.394000 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:47.393598 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 21:47:47.394000 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:47.393612 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589fd86f94-t58lj: secret "image-registry-tls" not found
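Three secrets keep failing the same way in this stretch: openshift-image-registry/image-registry-tls, openshift-ingress-canary/canary-serving-cert, and openshift-dns/dns-default-metrics-tls. A quick way to check whether they exist yet is a standard client-go lookup, sketched below; the kubeconfig path is a placeholder to adjust, and the namespace/name pairs are taken from the log above.

// Check whether the secrets these mount failures are waiting on exist yet.
// Uses only standard client-go calls; /path/to/kubeconfig is a placeholder.
package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// The three secrets the kubelet keeps retrying in this excerpt.
	targets := []struct{ ns, name string }{
		{"openshift-image-registry", "image-registry-tls"},
		{"openshift-ingress-canary", "canary-serving-cert"},
		{"openshift-dns", "dns-default-metrics-tls"},
	}
	for _, t := range targets {
		_, err := client.CoreV1().Secrets(t.ns).Get(context.TODO(), t.name, metav1.GetOptions{})
		switch {
		case err == nil:
			fmt.Printf("%s/%s: present\n", t.ns, t.name)
		case apierrors.IsNotFound(err):
			fmt.Printf("%s/%s: still missing\n", t.ns, t.name)
		default:
			fmt.Printf("%s/%s: check failed: %v\n", t.ns, t.name, err)
		}
	}
}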
Apr 20 21:47:47.394000 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:47.393668 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls podName:e7dbf0fe-03e6-46ca-88a7-16abae2daac1 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:51.393650007 +0000 UTC m=+41.554504125 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls") pod "image-registry-589fd86f94-t58lj" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1") : secret "image-registry-tls" not found
Apr 20 21:47:47.494267 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:47.494227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7"
Apr 20 21:47:47.494475 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:47.494328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h"
Apr 20 21:47:47.494475 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:47.494410 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 21:47:47.494587 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:47.494471 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 21:47:47.494587 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:47.494480 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert podName:bf0d97d7-a0a1-4f99-802a-39ac411ff714 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:51.494461269 +0000 UTC m=+41.655315406 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert") pod "ingress-canary-gm5l7" (UID: "bf0d97d7-a0a1-4f99-802a-39ac411ff714") : secret "canary-serving-cert" not found
Apr 20 21:47:47.494587 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:47.494539 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls podName:2f2509d8-512e-4191-a295-3e79802650ac nodeName:}" failed. No retries permitted until 2026-04-20 21:47:51.494522049 +0000 UTC m=+41.655376180 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls") pod "dns-default-vb97h" (UID: "2f2509d8-512e-4191-a295-3e79802650ac") : secret "dns-default-metrics-tls" not found
Apr 20 21:47:50.456281 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:50.456214 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fxd5b" podStartSLOduration=10.485801606 podStartE2EDuration="40.456193113s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:47:13.07349564 +0000 UTC m=+3.234349775" lastFinishedPulling="2026-04-20 21:47:43.043887164 +0000 UTC m=+33.204741282" observedRunningTime="2026-04-20 21:47:45.671112107 +0000 UTC m=+35.831966251" watchObservedRunningTime="2026-04-20 21:47:50.456193113 +0000 UTC m=+40.617047256"
Apr 20 21:47:50.651130 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:50.651093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" event={"ID":"c31613c1-5256-4da7-9941-7c734fc3dce2","Type":"ContainerStarted","Data":"81cd3671383b4b9889f50e6974702a7a7dc643d466491f64647b019d66908611"}
Apr 20 21:47:50.652167 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:50.652146 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" event={"ID":"2294a10e-a1dc-4759-a425-e047c7157139","Type":"ContainerStarted","Data":"667863bbeb13f668d1d3366e02860338c6ffcb667ddfd9ea8b060ac3f9fc9498"}
Apr 20 21:47:50.652387 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:50.652353 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm"
Apr 20 21:47:50.653350 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:50.653325 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" event={"ID":"8e4c4164-f43b-4dfe-b639-705ced10d164","Type":"ContainerStarted","Data":"88ae3e0680b990499ea8bd904ad6f8f77d1b39c1c591f9c7c0a34050a5b0c005"}
Apr 20 21:47:50.654270 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:50.654252 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm"
Apr 20 21:47:50.666635 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:50.666596 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" podStartSLOduration=18.596919839999998 podStartE2EDuration="24.666584896s" podCreationTimestamp="2026-04-20 21:47:26 +0000 UTC" firstStartedPulling="2026-04-20 21:47:44.170982507 +0000 UTC m=+34.331836624" lastFinishedPulling="2026-04-20 21:47:50.240647558 +0000 UTC m=+40.401501680" observedRunningTime="2026-04-20 21:47:50.665655378 +0000 UTC m=+40.826509519" watchObservedRunningTime="2026-04-20 21:47:50.666584896 +0000 UTC m=+40.827439035"
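The "Observed pod startup duration" entries report both podStartSLOduration and podStartE2EDuration. For multus-additional-cni-plugins-fxd5b above, the SLO figure matches the end-to-end duration minus the image-pull window (lastFinishedPulling - firstStartedPulling) to within tens of nanoseconds, which suggests that is how it is derived; that relationship is inferred from these values, not taken from documentation. The arithmetic, using the values copied from the log:

// Checking the arithmetic in the multus startup-duration entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	first, _ := time.Parse(layout, "2026-04-20 21:47:13.07349564 +0000 UTC")  // firstStartedPulling
	last, _ := time.Parse(layout, "2026-04-20 21:47:43.043887164 +0000 UTC") // lastFinishedPulling
	e2e := 40456193113 * time.Nanosecond                                     // podStartE2EDuration = 40.456193113s

	pull := last.Sub(first)
	fmt.Println("pull window:", pull)     // ~29.970391524s
	fmt.Println("e2e - pull: ", e2e-pull) // ~10.485801589s
	// Logged podStartSLOduration=10.485801606 -- equal to within ~17ns,
	// presumably monotonic-clock rounding.
}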
\"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:51.422904 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:51.422854 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:47:51.422904 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:51.422872 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589fd86f94-t58lj: secret "image-registry-tls" not found Apr 20 21:47:51.422984 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:51.422924 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls podName:e7dbf0fe-03e6-46ca-88a7-16abae2daac1 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:59.422909552 +0000 UTC m=+49.583763674 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls") pod "image-registry-589fd86f94-t58lj" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1") : secret "image-registry-tls" not found Apr 20 21:47:51.523657 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:51.523617 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:47:51.524022 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:51.523687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:51.524022 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:51.523762 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:47:51.524022 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:51.523783 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:47:51.524022 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:51.523825 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert podName:bf0d97d7-a0a1-4f99-802a-39ac411ff714 nodeName:}" failed. No retries permitted until 2026-04-20 21:47:59.523809027 +0000 UTC m=+49.684663144 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert") pod "ingress-canary-gm5l7" (UID: "bf0d97d7-a0a1-4f99-802a-39ac411ff714") : secret "canary-serving-cert" not found Apr 20 21:47:51.524022 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:51.523839 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls podName:2f2509d8-512e-4191-a295-3e79802650ac nodeName:}" failed. No retries permitted until 2026-04-20 21:47:59.523833493 +0000 UTC m=+49.684687611 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls") pod "dns-default-vb97h" (UID: "2f2509d8-512e-4191-a295-3e79802650ac") : secret "dns-default-metrics-tls" not found Apr 20 21:47:54.662236 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:54.662195 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" event={"ID":"c31613c1-5256-4da7-9941-7c734fc3dce2","Type":"ContainerStarted","Data":"823a240f6774e0d5b388a76bdbbe3b9ff8273595f818acc713bc57edcdc86a69"} Apr 20 21:47:54.662236 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:54.662239 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" event={"ID":"c31613c1-5256-4da7-9941-7c734fc3dce2","Type":"ContainerStarted","Data":"35b764da4c047cfce2de11c3089e52cf768b4b679f8613bd748211f67113325b"} Apr 20 21:47:54.680062 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:54.680016 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" podStartSLOduration=18.767036108 podStartE2EDuration="28.680003581s" podCreationTimestamp="2026-04-20 21:47:26 +0000 UTC" firstStartedPulling="2026-04-20 21:47:44.187982887 +0000 UTC m=+34.348837005" lastFinishedPulling="2026-04-20 21:47:54.100950357 +0000 UTC m=+44.261804478" observedRunningTime="2026-04-20 21:47:54.679866538 +0000 UTC m=+44.840720675" watchObservedRunningTime="2026-04-20 21:47:54.680003581 +0000 UTC m=+44.840857721" Apr 20 21:47:54.680567 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:54.680544 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" podStartSLOduration=22.626672411 podStartE2EDuration="28.680538505s" podCreationTimestamp="2026-04-20 21:47:26 +0000 UTC" firstStartedPulling="2026-04-20 21:47:44.17112056 +0000 UTC m=+34.331974678" lastFinishedPulling="2026-04-20 21:47:50.22498664 +0000 UTC m=+40.385840772" observedRunningTime="2026-04-20 21:47:50.694664151 +0000 UTC m=+40.855518303" watchObservedRunningTime="2026-04-20 21:47:54.680538505 +0000 UTC m=+44.841392644" Apr 20 21:47:59.485782 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:59.485742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:47:59.486148 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:59.485898 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:47:59.486148 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:59.485924 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589fd86f94-t58lj: secret "image-registry-tls" not found Apr 20 21:47:59.486148 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:59.485982 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls podName:e7dbf0fe-03e6-46ca-88a7-16abae2daac1 nodeName:}" failed. 
No retries permitted until 2026-04-20 21:48:15.485967126 +0000 UTC m=+65.646821245 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls") pod "image-registry-589fd86f94-t58lj" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1") : secret "image-registry-tls" not found Apr 20 21:47:59.587236 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:59.587199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:47:59.587415 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:47:59.587268 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:47:59.587415 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:59.587335 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:47:59.587415 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:59.587390 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:47:59.587514 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:59.587424 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls podName:2f2509d8-512e-4191-a295-3e79802650ac nodeName:}" failed. No retries permitted until 2026-04-20 21:48:15.58740786 +0000 UTC m=+65.748261978 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls") pod "dns-default-vb97h" (UID: "2f2509d8-512e-4191-a295-3e79802650ac") : secret "dns-default-metrics-tls" not found Apr 20 21:47:59.587514 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:47:59.587439 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert podName:bf0d97d7-a0a1-4f99-802a-39ac411ff714 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:15.587433116 +0000 UTC m=+65.748287233 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert") pod "ingress-canary-gm5l7" (UID: "bf0d97d7-a0a1-4f99-802a-39ac411ff714") : secret "canary-serving-cert" not found Apr 20 21:48:08.621257 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:08.621230 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mlfps" Apr 20 21:48:15.505630 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:15.505590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:48:15.506027 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:15.505726 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:48:15.506027 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:15.505737 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589fd86f94-t58lj: secret "image-registry-tls" not found Apr 20 21:48:15.506027 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:15.505795 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls podName:e7dbf0fe-03e6-46ca-88a7-16abae2daac1 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:47.505781635 +0000 UTC m=+97.666635753 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls") pod "image-registry-589fd86f94-t58lj" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1") : secret "image-registry-tls" not found Apr 20 21:48:15.606464 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:15.606433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:48:15.606636 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:15.606523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:48:15.606636 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:15.606574 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:48:15.606732 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:15.606640 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls podName:2f2509d8-512e-4191-a295-3e79802650ac nodeName:}" failed. No retries permitted until 2026-04-20 21:48:47.606625856 +0000 UTC m=+97.767479975 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls") pod "dns-default-vb97h" (UID: "2f2509d8-512e-4191-a295-3e79802650ac") : secret "dns-default-metrics-tls" not found Apr 20 21:48:15.606732 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:15.606653 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:48:15.606732 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:15.606705 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert podName:bf0d97d7-a0a1-4f99-802a-39ac411ff714 nodeName:}" failed. No retries permitted until 2026-04-20 21:48:47.60669089 +0000 UTC m=+97.767545007 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert") pod "ingress-canary-gm5l7" (UID: "bf0d97d7-a0a1-4f99-802a-39ac411ff714") : secret "canary-serving-cert" not found Apr 20 21:48:16.110857 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.110818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:48:16.111022 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.110913 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:48:16.113120 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.113099 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 21:48:16.113166 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.113153 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 21:48:16.121878 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:16.121860 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 21:48:16.121922 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:16.121911 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:20.121897193 +0000 UTC m=+130.282751311 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : secret "metrics-daemon-secret" not found Apr 20 21:48:16.124931 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.124896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f57a85f5-bd23-4292-9e22-6f0078a7e4f0-original-pull-secret\") pod \"global-pull-secret-syncer-7524q\" (UID: \"f57a85f5-bd23-4292-9e22-6f0078a7e4f0\") " pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:48:16.211512 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.211472 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:48:16.213654 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.213637 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 21:48:16.223774 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.223756 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 21:48:16.234415 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.234389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jc2f\" (UniqueName: \"kubernetes.io/projected/fe4be124-58a1-4591-b319-21b9bcd1aae4-kube-api-access-5jc2f\") pod \"network-check-target-hj6pg\" (UID: \"fe4be124-58a1-4591-b319-21b9bcd1aae4\") " pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:48:16.346473 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.346441 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hbjh6\"" Apr 20 21:48:16.354826 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.354805 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:48:16.354957 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.354899 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7524q" Apr 20 21:48:16.477875 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.477841 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hj6pg"] Apr 20 21:48:16.482718 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:48:16.482687 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe4be124_58a1_4591_b319_21b9bcd1aae4.slice/crio-3cee1af70f2f0518b385994564f0bbdaab0332450b89eaff81b7f30400908a66 WatchSource:0}: Error finding container 3cee1af70f2f0518b385994564f0bbdaab0332450b89eaff81b7f30400908a66: Status 404 returned error can't find the container with id 3cee1af70f2f0518b385994564f0bbdaab0332450b89eaff81b7f30400908a66 Apr 20 21:48:16.492897 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.492875 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7524q"] Apr 20 21:48:16.495404 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:48:16.495383 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf57a85f5_bd23_4292_9e22_6f0078a7e4f0.slice/crio-6ca84ad2eaa43c5b4c6ef3ea5bf94cd3c20052902474c6d7bd46c16e6ab36508 WatchSource:0}: Error finding container 6ca84ad2eaa43c5b4c6ef3ea5bf94cd3c20052902474c6d7bd46c16e6ab36508: Status 404 returned error can't find the container with id 6ca84ad2eaa43c5b4c6ef3ea5bf94cd3c20052902474c6d7bd46c16e6ab36508 Apr 20 21:48:16.704853 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.704763 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7524q" event={"ID":"f57a85f5-bd23-4292-9e22-6f0078a7e4f0","Type":"ContainerStarted","Data":"6ca84ad2eaa43c5b4c6ef3ea5bf94cd3c20052902474c6d7bd46c16e6ab36508"} Apr 20 21:48:16.705766 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:16.705741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hj6pg" event={"ID":"fe4be124-58a1-4591-b319-21b9bcd1aae4","Type":"ContainerStarted","Data":"3cee1af70f2f0518b385994564f0bbdaab0332450b89eaff81b7f30400908a66"} Apr 20 21:48:21.718864 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:21.718834 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hj6pg" event={"ID":"fe4be124-58a1-4591-b319-21b9bcd1aae4","Type":"ContainerStarted","Data":"cc04d2d176405d9a2333ff07f0a57bb4a97a6bd3573091922c6a36d89786d3ec"} Apr 20 21:48:21.734523 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:21.734348 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:48:21.748912 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:21.748846 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hj6pg" podStartSLOduration=66.887338775 podStartE2EDuration="1m11.748830459s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:48:16.484526249 +0000 UTC m=+66.645380367" lastFinishedPulling="2026-04-20 21:48:21.34601793 +0000 UTC m=+71.506872051" observedRunningTime="2026-04-20 21:48:21.747208435 +0000 UTC m=+71.908062590" watchObservedRunningTime="2026-04-20 21:48:21.748830459 +0000 UTC m=+71.909684598" Apr 20 21:48:22.723114 ip-10-0-140-110 
kubenswrapper[2574]: I0420 21:48:22.723063 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7524q" event={"ID":"f57a85f5-bd23-4292-9e22-6f0078a7e4f0","Type":"ContainerStarted","Data":"7a16042e87005d18f6104abd6fde9d10772452c33311ecf822427bb553f24dbe"} Apr 20 21:48:22.736124 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:22.736084 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7524q" podStartSLOduration=66.531298323 podStartE2EDuration="1m11.736071109s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="2026-04-20 21:48:16.497043508 +0000 UTC m=+66.657897630" lastFinishedPulling="2026-04-20 21:48:21.701816297 +0000 UTC m=+71.862670416" observedRunningTime="2026-04-20 21:48:22.735947464 +0000 UTC m=+72.896801606" watchObservedRunningTime="2026-04-20 21:48:22.736071109 +0000 UTC m=+72.896925249" Apr 20 21:48:47.528124 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:47.528076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:48:47.528581 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:47.528227 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 21:48:47.528581 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:47.528246 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-589fd86f94-t58lj: secret "image-registry-tls" not found Apr 20 21:48:47.528581 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:47.528314 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls podName:e7dbf0fe-03e6-46ca-88a7-16abae2daac1 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:51.528298452 +0000 UTC m=+161.689152570 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls") pod "image-registry-589fd86f94-t58lj" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1") : secret "image-registry-tls" not found Apr 20 21:48:47.628969 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:47.628940 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:48:47.629101 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:47.628991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:48:47.629101 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:47.629072 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 21:48:47.629101 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:47.629073 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 21:48:47.629200 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:47.629121 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls podName:2f2509d8-512e-4191-a295-3e79802650ac nodeName:}" failed. No retries permitted until 2026-04-20 21:49:51.629108066 +0000 UTC m=+161.789962184 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls") pod "dns-default-vb97h" (UID: "2f2509d8-512e-4191-a295-3e79802650ac") : secret "dns-default-metrics-tls" not found Apr 20 21:48:47.629200 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:48:47.629135 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert podName:bf0d97d7-a0a1-4f99-802a-39ac411ff714 nodeName:}" failed. No retries permitted until 2026-04-20 21:49:51.629128499 +0000 UTC m=+161.789982617 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert") pod "ingress-canary-gm5l7" (UID: "bf0d97d7-a0a1-4f99-802a-39ac411ff714") : secret "canary-serving-cert" not found Apr 20 21:48:52.725383 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:48:52.725351 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hj6pg" Apr 20 21:49:20.168149 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:20.168088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:49:20.168592 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:49:20.168249 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 21:49:20.168592 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:49:20.168328 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs podName:a08eea80-f553-4499-a8dc-94c9591d8221 nodeName:}" failed. No retries permitted until 2026-04-20 21:51:22.168312157 +0000 UTC m=+252.329166279 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs") pod "network-metrics-daemon-xmrt9" (UID: "a08eea80-f553-4499-a8dc-94c9591d8221") : secret "metrics-daemon-secret" not found Apr 20 21:49:38.133056 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:38.133027 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6drvt_3a1935ff-0056-494d-bd40-1316c97c620f/dns-node-resolver/0.log" Apr 20 21:49:39.532734 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:39.532709 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j5w8c_25ce781b-7c4c-499a-bc4a-2efb25261488/node-ca/0.log" Apr 20 21:49:46.678611 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:49:46.678561 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" podUID="e7dbf0fe-03e6-46ca-88a7-16abae2daac1" Apr 20 21:49:46.749942 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:49:46.749895 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vb97h" podUID="2f2509d8-512e-4191-a295-3e79802650ac" Apr 20 21:49:46.771818 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:49:46.771784 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-gm5l7" podUID="bf0d97d7-a0a1-4f99-802a-39ac411ff714" Apr 20 21:49:46.918415 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:46.918327 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vb97h" Apr 20 21:49:46.918544 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:46.918327 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:49:47.437022 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:49:47.436980 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-xmrt9" podUID="a08eea80-f553-4499-a8dc-94c9591d8221" Apr 20 21:49:50.652949 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:50.652888 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" podUID="2294a10e-a1dc-4759-a425-e047c7157139" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused" Apr 20 21:49:50.928469 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:50.928435 2574 generic.go:358] "Generic (PLEG): container finished" podID="2294a10e-a1dc-4759-a425-e047c7157139" containerID="667863bbeb13f668d1d3366e02860338c6ffcb667ddfd9ea8b060ac3f9fc9498" exitCode=1 Apr 20 21:49:50.928649 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:50.928509 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" event={"ID":"2294a10e-a1dc-4759-a425-e047c7157139","Type":"ContainerDied","Data":"667863bbeb13f668d1d3366e02860338c6ffcb667ddfd9ea8b060ac3f9fc9498"} Apr 20 21:49:50.928892 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:50.928866 2574 scope.go:117] "RemoveContainer" containerID="667863bbeb13f668d1d3366e02860338c6ffcb667ddfd9ea8b060ac3f9fc9498" Apr 20 21:49:50.929787 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:50.929767 2574 generic.go:358] "Generic (PLEG): container finished" podID="8e4c4164-f43b-4dfe-b639-705ced10d164" containerID="88ae3e0680b990499ea8bd904ad6f8f77d1b39c1c591f9c7c0a34050a5b0c005" exitCode=255 Apr 20 21:49:50.929862 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:50.929796 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" event={"ID":"8e4c4164-f43b-4dfe-b639-705ced10d164","Type":"ContainerDied","Data":"88ae3e0680b990499ea8bd904ad6f8f77d1b39c1c591f9c7c0a34050a5b0c005"} Apr 20 21:49:50.930166 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:50.930087 2574 scope.go:117] "RemoveContainer" containerID="88ae3e0680b990499ea8bd904ad6f8f77d1b39c1c591f9c7c0a34050a5b0c005" Apr 20 21:49:51.592421 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.592362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:49:51.594712 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.594689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"image-registry-589fd86f94-t58lj\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " 
pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:49:51.693500 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.693462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:49:51.693861 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.693525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:49:51.695797 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.695776 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f2509d8-512e-4191-a295-3e79802650ac-metrics-tls\") pod \"dns-default-vb97h\" (UID: \"2f2509d8-512e-4191-a295-3e79802650ac\") " pod="openshift-dns/dns-default-vb97h" Apr 20 21:49:51.695890 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.695874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf0d97d7-a0a1-4f99-802a-39ac411ff714-cert\") pod \"ingress-canary-gm5l7\" (UID: \"bf0d97d7-a0a1-4f99-802a-39ac411ff714\") " pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:49:51.722115 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.722092 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rxx7z\"" Apr 20 21:49:51.722227 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.722092 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-s4j5s\"" Apr 20 21:49:51.730746 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.730723 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vb97h" Apr 20 21:49:51.730746 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.730736 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:49:51.854465 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.854438 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vb97h"] Apr 20 21:49:51.857996 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:49:51.857966 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2509d8_512e_4191_a295_3e79802650ac.slice/crio-7a0cc036e06e50358c6de133b7126488569a1e9887bec03a4598e48d8aab3ad4 WatchSource:0}: Error finding container 7a0cc036e06e50358c6de133b7126488569a1e9887bec03a4598e48d8aab3ad4: Status 404 returned error can't find the container with id 7a0cc036e06e50358c6de133b7126488569a1e9887bec03a4598e48d8aab3ad4 Apr 20 21:49:51.884844 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.884797 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-589fd86f94-t58lj"] Apr 20 21:49:51.889263 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:49:51.889237 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7dbf0fe_03e6_46ca_88a7_16abae2daac1.slice/crio-a0ecd155b3fec02c569b6efed9b4131446b6d3679e29195a2b7229425a06c9a7 WatchSource:0}: Error finding container a0ecd155b3fec02c569b6efed9b4131446b6d3679e29195a2b7229425a06c9a7: Status 404 returned error can't find the container with id a0ecd155b3fec02c569b6efed9b4131446b6d3679e29195a2b7229425a06c9a7 Apr 20 21:49:51.933411 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.933386 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vb97h" event={"ID":"2f2509d8-512e-4191-a295-3e79802650ac","Type":"ContainerStarted","Data":"7a0cc036e06e50358c6de133b7126488569a1e9887bec03a4598e48d8aab3ad4"} Apr 20 21:49:51.934355 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.934329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" event={"ID":"e7dbf0fe-03e6-46ca-88a7-16abae2daac1","Type":"ContainerStarted","Data":"a0ecd155b3fec02c569b6efed9b4131446b6d3679e29195a2b7229425a06c9a7"} Apr 20 21:49:51.935806 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.935786 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" event={"ID":"2294a10e-a1dc-4759-a425-e047c7157139","Type":"ContainerStarted","Data":"564eb60c595feeafb34f8e3f712b5205af70f7145b7d1ffa26985ba87d87bc83"} Apr 20 21:49:51.936078 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.936050 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:49:51.936617 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.936601 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-758cb8bfc4-ngvgm" Apr 20 21:49:51.937515 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:51.937493 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5dcf5bfbfc-tgs4w" event={"ID":"8e4c4164-f43b-4dfe-b639-705ced10d164","Type":"ContainerStarted","Data":"065d9105ff79716fa6af4110744ce4c1cffe99200f116b20350f913ce97498a5"} Apr 20 21:49:52.941793 ip-10-0-140-110 
kubenswrapper[2574]: I0420 21:49:52.941753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" event={"ID":"e7dbf0fe-03e6-46ca-88a7-16abae2daac1","Type":"ContainerStarted","Data":"762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e"} Apr 20 21:49:52.962838 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:52.962783 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" podStartSLOduration=161.962763212 podStartE2EDuration="2m41.962763212s" podCreationTimestamp="2026-04-20 21:47:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:49:52.961626193 +0000 UTC m=+163.122480332" watchObservedRunningTime="2026-04-20 21:49:52.962763212 +0000 UTC m=+163.123617367" Apr 20 21:49:53.946721 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:53.946688 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vb97h" event={"ID":"2f2509d8-512e-4191-a295-3e79802650ac","Type":"ContainerStarted","Data":"95e486f28eadba378d8d92fe48c2dc60382cda68e91617cd0ebc9097681c2699"} Apr 20 21:49:53.947135 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:53.946730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vb97h" event={"ID":"2f2509d8-512e-4191-a295-3e79802650ac","Type":"ContainerStarted","Data":"81e0f9a59afc57cc5ea5c6a47e7fbd2b4bf5c2176a02dddaf016a504edc1ebbf"} Apr 20 21:49:53.947135 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:53.946827 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:49:53.961552 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:53.961493 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vb97h" podStartSLOduration=129.767812113 podStartE2EDuration="2m10.96148067s" podCreationTimestamp="2026-04-20 21:47:43 +0000 UTC" firstStartedPulling="2026-04-20 21:49:51.859895519 +0000 UTC m=+162.020749637" lastFinishedPulling="2026-04-20 21:49:53.053564063 +0000 UTC m=+163.214418194" observedRunningTime="2026-04-20 21:49:53.961055014 +0000 UTC m=+164.121909151" watchObservedRunningTime="2026-04-20 21:49:53.96148067 +0000 UTC m=+164.122334822" Apr 20 21:49:54.950197 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:54.950165 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vb97h" Apr 20 21:49:59.426824 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.426727 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:49:59.427163 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.426731 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:49:59.428971 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.428954 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bqmdj\"" Apr 20 21:49:59.437427 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.437410 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gm5l7" Apr 20 21:49:59.546786 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.546756 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gm5l7"] Apr 20 21:49:59.550060 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:49:59.550037 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0d97d7_a0a1_4f99_802a_39ac411ff714.slice/crio-0514977f3e26646f1513f65dad7009fa9b60102f590ba34e23ce5dcda71b4eac WatchSource:0}: Error finding container 0514977f3e26646f1513f65dad7009fa9b60102f590ba34e23ce5dcda71b4eac: Status 404 returned error can't find the container with id 0514977f3e26646f1513f65dad7009fa9b60102f590ba34e23ce5dcda71b4eac Apr 20 21:49:59.753472 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.753393 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hmbdw"] Apr 20 21:49:59.757384 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.757355 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.759570 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.759549 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 21:49:59.759655 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.759589 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 21:49:59.760515 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.760500 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 21:49:59.760673 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.760657 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 21:49:59.760773 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.760758 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-txw8c\"" Apr 20 21:49:59.773279 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.773256 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hmbdw"] Apr 20 21:49:59.849410 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.849362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9rzx\" (UniqueName: \"kubernetes.io/projected/fedb37b6-dbb2-4b57-ba53-813cae30c648-kube-api-access-b9rzx\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.849576 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.849422 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fedb37b6-dbb2-4b57-ba53-813cae30c648-data-volume\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.849576 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.849454 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fedb37b6-dbb2-4b57-ba53-813cae30c648-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.849576 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.849511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fedb37b6-dbb2-4b57-ba53-813cae30c648-crio-socket\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.849576 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.849562 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fedb37b6-dbb2-4b57-ba53-813cae30c648-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.950815 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.950778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fedb37b6-dbb2-4b57-ba53-813cae30c648-data-volume\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.950994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.950827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fedb37b6-dbb2-4b57-ba53-813cae30c648-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.950994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.950858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fedb37b6-dbb2-4b57-ba53-813cae30c648-crio-socket\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.950994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.950906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fedb37b6-dbb2-4b57-ba53-813cae30c648-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.950994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.950959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9rzx\" (UniqueName: \"kubernetes.io/projected/fedb37b6-dbb2-4b57-ba53-813cae30c648-kube-api-access-b9rzx\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.951206 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.951005 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fedb37b6-dbb2-4b57-ba53-813cae30c648-crio-socket\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.951206 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.951186 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fedb37b6-dbb2-4b57-ba53-813cae30c648-data-volume\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.951576 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.951551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fedb37b6-dbb2-4b57-ba53-813cae30c648-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.953333 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.953313 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fedb37b6-dbb2-4b57-ba53-813cae30c648-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.961298 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.961273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9rzx\" (UniqueName: \"kubernetes.io/projected/fedb37b6-dbb2-4b57-ba53-813cae30c648-kube-api-access-b9rzx\") pod \"insights-runtime-extractor-hmbdw\" (UID: \"fedb37b6-dbb2-4b57-ba53-813cae30c648\") " pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:49:59.965501 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:49:59.965462 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gm5l7" event={"ID":"bf0d97d7-a0a1-4f99-802a-39ac411ff714","Type":"ContainerStarted","Data":"0514977f3e26646f1513f65dad7009fa9b60102f590ba34e23ce5dcda71b4eac"} Apr 20 21:50:00.067073 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:00.066999 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hmbdw" Apr 20 21:50:00.198810 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:00.198779 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hmbdw"] Apr 20 21:50:00.202324 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:50:00.202298 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfedb37b6_dbb2_4b57_ba53_813cae30c648.slice/crio-22e2703151e4d3ada17966bebb959c7e344759d96351b43c9b81923b5b4eecbf WatchSource:0}: Error finding container 22e2703151e4d3ada17966bebb959c7e344759d96351b43c9b81923b5b4eecbf: Status 404 returned error can't find the container with id 22e2703151e4d3ada17966bebb959c7e344759d96351b43c9b81923b5b4eecbf Apr 20 21:50:00.969734 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:00.969694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hmbdw" event={"ID":"fedb37b6-dbb2-4b57-ba53-813cae30c648","Type":"ContainerStarted","Data":"87d6361b437f560ee12c32aaa88116bfc39a3936f3082b393738de1199d467b3"} Apr 20 21:50:00.969734 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:00.969730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hmbdw" event={"ID":"fedb37b6-dbb2-4b57-ba53-813cae30c648","Type":"ContainerStarted","Data":"22e2703151e4d3ada17966bebb959c7e344759d96351b43c9b81923b5b4eecbf"} Apr 20 21:50:01.974035 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:01.973997 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gm5l7" event={"ID":"bf0d97d7-a0a1-4f99-802a-39ac411ff714","Type":"ContainerStarted","Data":"b55b63272767a1098218338189e8fb06d3a4247ea4be301dab71af608148ef37"} Apr 20 21:50:01.975918 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:01.975889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hmbdw" event={"ID":"fedb37b6-dbb2-4b57-ba53-813cae30c648","Type":"ContainerStarted","Data":"097753993c683c57921ebade125edcf5fca218a0c1a53781dd8ba36e8f4dc39a"} Apr 20 21:50:01.987957 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:01.987895 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gm5l7" podStartSLOduration=137.554774934 podStartE2EDuration="2m18.987880554s" podCreationTimestamp="2026-04-20 21:47:43 +0000 UTC" firstStartedPulling="2026-04-20 21:49:59.551779187 +0000 UTC m=+169.712633304" lastFinishedPulling="2026-04-20 21:50:00.984884801 +0000 UTC m=+171.145738924" observedRunningTime="2026-04-20 21:50:01.987415267 +0000 UTC m=+172.148269408" watchObservedRunningTime="2026-04-20 21:50:01.987880554 +0000 UTC m=+172.148734699" Apr 20 21:50:02.980320 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:02.980232 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hmbdw" event={"ID":"fedb37b6-dbb2-4b57-ba53-813cae30c648","Type":"ContainerStarted","Data":"8b476f06ba7828d09d1f3549f32760411c4202422de61c41b68a02959a9b28a5"} Apr 20 21:50:02.995857 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:02.995733 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hmbdw" podStartSLOduration=1.650448174 podStartE2EDuration="3.995721434s" podCreationTimestamp="2026-04-20 21:49:59 +0000 UTC" 
firstStartedPulling="2026-04-20 21:50:00.268720531 +0000 UTC m=+170.429574649" lastFinishedPulling="2026-04-20 21:50:02.613993786 +0000 UTC m=+172.774847909" observedRunningTime="2026-04-20 21:50:02.995492903 +0000 UTC m=+173.156347067" watchObservedRunningTime="2026-04-20 21:50:02.995721434 +0000 UTC m=+173.156575573" Apr 20 21:50:04.955667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:04.955636 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vb97h" Apr 20 21:50:11.734991 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:11.734952 2574 patch_prober.go:28] interesting pod/image-registry-589fd86f94-t58lj container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 21:50:11.735445 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:11.735001 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" podUID="e7dbf0fe-03e6-46ca-88a7-16abae2daac1" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 21:50:13.000770 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.000735 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ccgvl"] Apr 20 21:50:13.006415 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.006394 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.010650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.009164 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 21:50:13.010650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.009455 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 21:50:13.010650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.009656 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-m7svp\"" Apr 20 21:50:13.010650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.009823 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 21:50:13.010650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.010037 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 21:50:13.010650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.010078 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 21:50:13.010650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.010257 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 21:50:13.048344 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29c6b565-3905-48a9-b7e8-3853908ddeb8-sys\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " 
pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.048344 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-tls\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.048508 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.048508 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szxh9\" (UniqueName: \"kubernetes.io/projected/29c6b565-3905-48a9-b7e8-3853908ddeb8-kube-api-access-szxh9\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.048508 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048459 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-accelerators-collector-config\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.048508 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048505 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-wtmp\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.048657 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-textfile\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.048657 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048559 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/29c6b565-3905-48a9-b7e8-3853908ddeb8-root\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.048657 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.048574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29c6b565-3905-48a9-b7e8-3853908ddeb8-metrics-client-ca\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 
21:50:13.148906 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.148879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-textfile\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl"
Apr 20 21:50:13.148906 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.148907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/29c6b565-3905-48a9-b7e8-3853908ddeb8-root\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl"
Apr 20 21:50:13.149074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.148922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29c6b565-3905-48a9-b7e8-3853908ddeb8-metrics-client-ca\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl"
Apr 20 21:50:13.149074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.148968 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/29c6b565-3905-48a9-b7e8-3853908ddeb8-sys\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl"
Apr 20 21:50:13.149074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.148992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-tls\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl"
Apr 20 21:50:13.149074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149003 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/29c6b565-3905-48a9-b7e8-3853908ddeb8-root\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl"
Apr 20 21:50:13.149074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl"
Apr 20 21:50:13.149074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szxh9\" (UniqueName: \"kubernetes.io/projected/29c6b565-3905-48a9-b7e8-3853908ddeb8-kube-api-access-szxh9\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl"
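
The three record types above trace kubelet's volume manager at work (a reading of the pattern visible in this log, not a claim about the exact source): reconciler_common.go:251 first confirms each of the new pod's volumes is attached (VerifyControllerAttachedVolume), reconciler_common.go:224 then starts a MountVolume operation per volume, and operation_generator.go:615 reports each SetUp success once the plugin finishes. A minimal desired-state/actual-state sketch of that loop, with all types and names hypothetical:

    package main

    import "fmt"

    // Hypothetical miniature of the desired-state vs. actual-state
    // reconcile pattern the records above show; not kubelet's real types.
    type volume struct{ name, plugin string }

    func reconcile(desired []volume, mounted map[string]bool) {
    	for _, v := range desired {
    		if mounted[v.name] {
    			continue // already in actual state; nothing to do
    		}
    		// kubelet logs "operationExecutor.MountVolume started ..."
    		fmt.Printf("MountVolume started for volume %q (%s)\n", v.name, v.plugin)
    		// ... plugin SetUp runs; on success kubelet logs
    		// "MountVolume.SetUp succeeded ..." and actual state is updated.
    		mounted[v.name] = true
    	}
    }

    func main() {
    	desired := []volume{
    		{"node-exporter-textfile", "kubernetes.io/empty-dir"},
    		{"root", "kubernetes.io/host-path"},
    		{"node-exporter-tls", "kubernetes.io/secret"},
    	}
    	reconcile(desired, map[string]bool{})
    }

Because each volume reaches the mounted state exactly once and a re-run is a no-op, the "started" and "SetUp succeeded" records above can interleave freely without repeating work.
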
pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.149074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-accelerators-collector-config\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.149470 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-wtmp\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.149470 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149247 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-wtmp\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.149470 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-textfile\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.149620 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29c6b565-3905-48a9-b7e8-3853908ddeb8-metrics-client-ca\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.149737 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.149714 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-accelerators-collector-config\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.151323 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.151299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-tls\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.151541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.151523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29c6b565-3905-48a9-b7e8-3853908ddeb8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.158230 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.158212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-szxh9\" (UniqueName: \"kubernetes.io/projected/29c6b565-3905-48a9-b7e8-3853908ddeb8-kube-api-access-szxh9\") pod \"node-exporter-ccgvl\" (UID: \"29c6b565-3905-48a9-b7e8-3853908ddeb8\") " pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.318598 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:13.318527 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ccgvl" Apr 20 21:50:13.328972 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:50:13.328939 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c6b565_3905_48a9_b7e8_3853908ddeb8.slice/crio-8004fe573a8c0e0995f409ac3493215530aaa5953fa46825c21bb7dc76f3fc65 WatchSource:0}: Error finding container 8004fe573a8c0e0995f409ac3493215530aaa5953fa46825c21bb7dc76f3fc65: Status 404 returned error can't find the container with id 8004fe573a8c0e0995f409ac3493215530aaa5953fa46825c21bb7dc76f3fc65 Apr 20 21:50:14.008143 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:14.008108 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ccgvl" event={"ID":"29c6b565-3905-48a9-b7e8-3853908ddeb8","Type":"ContainerStarted","Data":"8004fe573a8c0e0995f409ac3493215530aaa5953fa46825c21bb7dc76f3fc65"} Apr 20 21:50:14.953907 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:14.953872 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:50:15.011796 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:15.011758 2574 generic.go:358] "Generic (PLEG): container finished" podID="29c6b565-3905-48a9-b7e8-3853908ddeb8" containerID="058ade5a7494b1a7fdd3738970ff1fe0bbda93b83e996bbe2bb342d77339c123" exitCode=0 Apr 20 21:50:15.012136 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:15.011813 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ccgvl" event={"ID":"29c6b565-3905-48a9-b7e8-3853908ddeb8","Type":"ContainerDied","Data":"058ade5a7494b1a7fdd3738970ff1fe0bbda93b83e996bbe2bb342d77339c123"} Apr 20 21:50:16.015854 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:16.015817 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ccgvl" event={"ID":"29c6b565-3905-48a9-b7e8-3853908ddeb8","Type":"ContainerStarted","Data":"65967ac563a55f255a3a0265b4fa84c882989a39f22b5f59bcd14ab2799d02ae"} Apr 20 21:50:16.015854 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:16.015849 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ccgvl" event={"ID":"29c6b565-3905-48a9-b7e8-3853908ddeb8","Type":"ContainerStarted","Data":"7bad781fc58b25eb1eaf1ac14f32e4b245aabb10a8971475a4c4a80283218f90"} Apr 20 21:50:16.036684 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:16.036638 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ccgvl" podStartSLOduration=3.3208756790000002 podStartE2EDuration="4.03662535s" podCreationTimestamp="2026-04-20 21:50:12 +0000 UTC" firstStartedPulling="2026-04-20 21:50:13.331046613 +0000 UTC m=+183.491900737" lastFinishedPulling="2026-04-20 21:50:14.046796277 +0000 UTC m=+184.207650408" observedRunningTime="2026-04-20 21:50:16.03580885 +0000 UTC m=+186.196662989" watchObservedRunningTime="2026-04-20 21:50:16.03662535 +0000 UTC m=+186.197479489" Apr 20 21:50:22.166461 ip-10-0-140-110 
kubenswrapper[2574]: I0420 21:50:22.166430 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-589fd86f94-t58lj"] Apr 20 21:50:34.015709 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:34.015669 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" podUID="c31613c1-5256-4da7-9941-7c734fc3dce2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 21:50:44.015188 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:44.015145 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" podUID="c31613c1-5256-4da7-9941-7c734fc3dce2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 21:50:47.188355 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.188296 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" podUID="e7dbf0fe-03e6-46ca-88a7-16abae2daac1" containerName="registry" containerID="cri-o://762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e" gracePeriod=30 Apr 20 21:50:47.413837 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.413816 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:50:47.496910 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.496829 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-installation-pull-secrets\") pod \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " Apr 20 21:50:47.496910 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.496868 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxx5r\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-kube-api-access-pxx5r\") pod \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " Apr 20 21:50:47.496910 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.496907 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-certificates\") pod \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " Apr 20 21:50:47.497163 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.496925 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-bound-sa-token\") pod \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " Apr 20 21:50:47.497163 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.496949 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") pod \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " Apr 20 21:50:47.497163 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.497055 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-ca-trust-extracted\") pod \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " Apr 20 21:50:47.497163 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.497110 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-trusted-ca\") pod \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " Apr 20 21:50:47.497163 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.497152 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-image-registry-private-configuration\") pod \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\" (UID: \"e7dbf0fe-03e6-46ca-88a7-16abae2daac1\") " Apr 20 21:50:47.497838 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.497728 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e7dbf0fe-03e6-46ca-88a7-16abae2daac1" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:47.497838 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.497770 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e7dbf0fe-03e6-46ca-88a7-16abae2daac1" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:50:47.499495 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.499440 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-kube-api-access-pxx5r" (OuterVolumeSpecName: "kube-api-access-pxx5r") pod "e7dbf0fe-03e6-46ca-88a7-16abae2daac1" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1"). InnerVolumeSpecName "kube-api-access-pxx5r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:50:47.499596 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.499575 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e7dbf0fe-03e6-46ca-88a7-16abae2daac1" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:50:47.499596 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.499579 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e7dbf0fe-03e6-46ca-88a7-16abae2daac1" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:50:47.499707 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.499647 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e7dbf0fe-03e6-46ca-88a7-16abae2daac1" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:50:47.499769 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.499746 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e7dbf0fe-03e6-46ca-88a7-16abae2daac1" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:50:47.505788 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.505744 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e7dbf0fe-03e6-46ca-88a7-16abae2daac1" (UID: "e7dbf0fe-03e6-46ca-88a7-16abae2daac1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:50:47.598249 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.598216 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-ca-trust-extracted\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:50:47.598249 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.598244 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-trusted-ca\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:50:47.598249 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.598255 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-image-registry-private-configuration\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:50:47.598493 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.598264 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-installation-pull-secrets\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:50:47.598493 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.598276 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxx5r\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-kube-api-access-pxx5r\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:50:47.598493 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.598285 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-certificates\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:50:47.598493 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.598293 2574 reconciler_common.go:299] 
"Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-bound-sa-token\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:50:47.598493 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:47.598302 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7dbf0fe-03e6-46ca-88a7-16abae2daac1-registry-tls\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:50:48.101163 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.101125 2574 generic.go:358] "Generic (PLEG): container finished" podID="e7dbf0fe-03e6-46ca-88a7-16abae2daac1" containerID="762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e" exitCode=0 Apr 20 21:50:48.101333 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.101189 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" Apr 20 21:50:48.101333 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.101203 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" event={"ID":"e7dbf0fe-03e6-46ca-88a7-16abae2daac1","Type":"ContainerDied","Data":"762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e"} Apr 20 21:50:48.101333 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.101237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-589fd86f94-t58lj" event={"ID":"e7dbf0fe-03e6-46ca-88a7-16abae2daac1","Type":"ContainerDied","Data":"a0ecd155b3fec02c569b6efed9b4131446b6d3679e29195a2b7229425a06c9a7"} Apr 20 21:50:48.101333 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.101252 2574 scope.go:117] "RemoveContainer" containerID="762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e" Apr 20 21:50:48.108793 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.108773 2574 scope.go:117] "RemoveContainer" containerID="762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e" Apr 20 21:50:48.109048 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:50:48.109029 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e\": container with ID starting with 762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e not found: ID does not exist" containerID="762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e" Apr 20 21:50:48.109097 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.109058 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e"} err="failed to get container status \"762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e\": rpc error: code = NotFound desc = could not find container \"762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e\": container with ID starting with 762dc15bbdc2b81ee4510c955793769a39d38fb29b11a6a6647dd70353d2174e not found: ID does not exist" Apr 20 21:50:48.119024 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.118999 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-589fd86f94-t58lj"] Apr 20 21:50:48.122606 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.122588 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-image-registry/image-registry-589fd86f94-t58lj"] Apr 20 21:50:48.433947 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:48.431466 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7dbf0fe-03e6-46ca-88a7-16abae2daac1" path="/var/lib/kubelet/pods/e7dbf0fe-03e6-46ca-88a7-16abae2daac1/volumes" Apr 20 21:50:54.015471 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:54.015422 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" podUID="c31613c1-5256-4da7-9941-7c734fc3dce2" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 21:50:54.015854 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:54.015503 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" Apr 20 21:50:54.015954 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:54.015938 2574 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"823a240f6774e0d5b388a76bdbbe3b9ff8273595f818acc713bc57edcdc86a69"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 21:50:54.015994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:54.015973 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" podUID="c31613c1-5256-4da7-9941-7c734fc3dce2" containerName="service-proxy" containerID="cri-o://823a240f6774e0d5b388a76bdbbe3b9ff8273595f818acc713bc57edcdc86a69" gracePeriod=30 Apr 20 21:50:55.122241 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:55.122202 2574 generic.go:358] "Generic (PLEG): container finished" podID="c31613c1-5256-4da7-9941-7c734fc3dce2" containerID="823a240f6774e0d5b388a76bdbbe3b9ff8273595f818acc713bc57edcdc86a69" exitCode=2 Apr 20 21:50:55.122652 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:55.122266 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" event={"ID":"c31613c1-5256-4da7-9941-7c734fc3dce2","Type":"ContainerDied","Data":"823a240f6774e0d5b388a76bdbbe3b9ff8273595f818acc713bc57edcdc86a69"} Apr 20 21:50:55.122652 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:50:55.122304 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-799c57c576-2mgtk" event={"ID":"c31613c1-5256-4da7-9941-7c734fc3dce2","Type":"ContainerStarted","Data":"e82f25f23d3f7505f94ac507a155040a65b97b0aa4c0c943a7d84a9ba20a3788"} Apr 20 21:51:22.254524 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:22.254474 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" (UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:51:22.256715 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:22.256692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a08eea80-f553-4499-a8dc-94c9591d8221-metrics-certs\") pod \"network-metrics-daemon-xmrt9\" 
(UID: \"a08eea80-f553-4499-a8dc-94c9591d8221\") " pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:51:22.530503 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:22.530433 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5wzfs\"" Apr 20 21:51:22.538056 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:22.538028 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xmrt9" Apr 20 21:51:22.658002 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:22.657972 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xmrt9"] Apr 20 21:51:22.662426 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:51:22.662396 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08eea80_f553_4499_a8dc_94c9591d8221.slice/crio-d8229a4a6b061270f63c85a64e65c8b275f16038f71c21d7c4d6917cdd83e344 WatchSource:0}: Error finding container d8229a4a6b061270f63c85a64e65c8b275f16038f71c21d7c4d6917cdd83e344: Status 404 returned error can't find the container with id d8229a4a6b061270f63c85a64e65c8b275f16038f71c21d7c4d6917cdd83e344 Apr 20 21:51:23.189876 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:23.189846 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xmrt9" event={"ID":"a08eea80-f553-4499-a8dc-94c9591d8221","Type":"ContainerStarted","Data":"d8229a4a6b061270f63c85a64e65c8b275f16038f71c21d7c4d6917cdd83e344"} Apr 20 21:51:24.197102 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:24.197014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xmrt9" event={"ID":"a08eea80-f553-4499-a8dc-94c9591d8221","Type":"ContainerStarted","Data":"0d0ed8288bef7180397d4eb9af5a4cdfeccd81ba184e7ad558eac2cedd892904"} Apr 20 21:51:24.197102 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:24.197064 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xmrt9" event={"ID":"a08eea80-f553-4499-a8dc-94c9591d8221","Type":"ContainerStarted","Data":"ebe483e23edd54d1106cac72ff9b0c421d442747e0ff004a868d584d70ee1a7a"} Apr 20 21:51:24.217981 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:51:24.217928 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xmrt9" podStartSLOduration=253.223688774 podStartE2EDuration="4m14.217913659s" podCreationTimestamp="2026-04-20 21:47:10 +0000 UTC" firstStartedPulling="2026-04-20 21:51:22.66428921 +0000 UTC m=+252.825143343" lastFinishedPulling="2026-04-20 21:51:23.658514107 +0000 UTC m=+253.819368228" observedRunningTime="2026-04-20 21:51:24.216734508 +0000 UTC m=+254.377588648" watchObservedRunningTime="2026-04-20 21:51:24.217913659 +0000 UTC m=+254.378767799" Apr 20 21:52:10.317724 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:52:10.317698 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 21:52:10.318239 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:52:10.318139 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 21:54:25.586115 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.586080 2574 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4w89g"] Apr 20 21:54:25.586736 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.586405 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7dbf0fe-03e6-46ca-88a7-16abae2daac1" containerName="registry" Apr 20 21:54:25.586736 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.586425 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dbf0fe-03e6-46ca-88a7-16abae2daac1" containerName="registry" Apr 20 21:54:25.586736 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.586491 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7dbf0fe-03e6-46ca-88a7-16abae2daac1" containerName="registry" Apr 20 21:54:25.589224 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.589202 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" Apr 20 21:54:25.591254 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.591235 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 21:54:25.591681 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.591654 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 21:54:25.591787 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.591678 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-c72kc\"" Apr 20 21:54:25.596196 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.596174 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4w89g"] Apr 20 21:54:25.690872 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.690840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmbgs\" (UniqueName: \"kubernetes.io/projected/da3712bf-baf3-4aaf-aff1-e9c0c015b12b-kube-api-access-nmbgs\") pod \"cert-manager-cainjector-68b757865b-4w89g\" (UID: \"da3712bf-baf3-4aaf-aff1-e9c0c015b12b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" Apr 20 21:54:25.691047 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.690886 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3712bf-baf3-4aaf-aff1-e9c0c015b12b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4w89g\" (UID: \"da3712bf-baf3-4aaf-aff1-e9c0c015b12b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" Apr 20 21:54:25.791549 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.791520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmbgs\" (UniqueName: \"kubernetes.io/projected/da3712bf-baf3-4aaf-aff1-e9c0c015b12b-kube-api-access-nmbgs\") pod \"cert-manager-cainjector-68b757865b-4w89g\" (UID: \"da3712bf-baf3-4aaf-aff1-e9c0c015b12b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" Apr 20 21:54:25.791709 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.791562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3712bf-baf3-4aaf-aff1-e9c0c015b12b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4w89g\" (UID: \"da3712bf-baf3-4aaf-aff1-e9c0c015b12b\") " 
pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" Apr 20 21:54:25.798776 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.798750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3712bf-baf3-4aaf-aff1-e9c0c015b12b-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-4w89g\" (UID: \"da3712bf-baf3-4aaf-aff1-e9c0c015b12b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" Apr 20 21:54:25.799075 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.799052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmbgs\" (UniqueName: \"kubernetes.io/projected/da3712bf-baf3-4aaf-aff1-e9c0c015b12b-kube-api-access-nmbgs\") pod \"cert-manager-cainjector-68b757865b-4w89g\" (UID: \"da3712bf-baf3-4aaf-aff1-e9c0c015b12b\") " pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" Apr 20 21:54:25.898013 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:25.897979 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" Apr 20 21:54:26.007188 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:26.007155 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-4w89g"] Apr 20 21:54:26.012671 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:54:26.012645 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda3712bf_baf3_4aaf_aff1_e9c0c015b12b.slice/crio-3290687b032c5605b028ceed72bb65ca70d298d9722e63b035a41017d5c2151a WatchSource:0}: Error finding container 3290687b032c5605b028ceed72bb65ca70d298d9722e63b035a41017d5c2151a: Status 404 returned error can't find the container with id 3290687b032c5605b028ceed72bb65ca70d298d9722e63b035a41017d5c2151a Apr 20 21:54:26.014764 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:26.014746 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:54:26.648700 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:26.648650 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" event={"ID":"da3712bf-baf3-4aaf-aff1-e9c0c015b12b","Type":"ContainerStarted","Data":"3290687b032c5605b028ceed72bb65ca70d298d9722e63b035a41017d5c2151a"} Apr 20 21:54:29.659845 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:29.659806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" event={"ID":"da3712bf-baf3-4aaf-aff1-e9c0c015b12b","Type":"ContainerStarted","Data":"e6e774171ff14ade1af58699f1718e08a5a8233225d9db6dca4fc683a1cad6cb"} Apr 20 21:54:29.673958 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:54:29.673912 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-4w89g" podStartSLOduration=1.94545138 podStartE2EDuration="4.673900452s" podCreationTimestamp="2026-04-20 21:54:25 +0000 UTC" firstStartedPulling="2026-04-20 21:54:26.014934781 +0000 UTC m=+436.175788913" lastFinishedPulling="2026-04-20 21:54:28.743383849 +0000 UTC m=+438.904237985" observedRunningTime="2026-04-20 21:54:29.673274944 +0000 UTC m=+439.834129095" watchObservedRunningTime="2026-04-20 21:54:29.673900452 +0000 UTC m=+439.834754591" Apr 20 21:55:00.065999 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.065965 2574 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt"] Apr 20 21:55:00.069002 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.068981 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.071102 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.071077 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 21:55:00.071224 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.071077 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 21:55:00.071400 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.071384 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 21:55:00.071446 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.071435 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 21:55:00.071495 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.071458 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-lmzqf\"" Apr 20 21:55:00.084392 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.084352 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt"] Apr 20 21:55:00.132726 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.132695 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e80eab89-a43e-4d96-b8be-1bc48afb35f6-webhook-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.132726 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.132733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e80eab89-a43e-4d96-b8be-1bc48afb35f6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.132948 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.132795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vttzq\" (UniqueName: \"kubernetes.io/projected/e80eab89-a43e-4d96-b8be-1bc48afb35f6-kube-api-access-vttzq\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.233147 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.233106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vttzq\" (UniqueName: \"kubernetes.io/projected/e80eab89-a43e-4d96-b8be-1bc48afb35f6-kube-api-access-vttzq\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " 
pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.233350 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.233166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e80eab89-a43e-4d96-b8be-1bc48afb35f6-webhook-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.233350 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.233186 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e80eab89-a43e-4d96-b8be-1bc48afb35f6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.235623 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.235591 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e80eab89-a43e-4d96-b8be-1bc48afb35f6-webhook-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.235728 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.235707 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e80eab89-a43e-4d96-b8be-1bc48afb35f6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.244706 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.244683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vttzq\" (UniqueName: \"kubernetes.io/projected/e80eab89-a43e-4d96-b8be-1bc48afb35f6-kube-api-access-vttzq\") pod \"opendatahub-operator-controller-manager-f5f47469b-pqzwt\" (UID: \"e80eab89-a43e-4d96-b8be-1bc48afb35f6\") " pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.380427 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.380395 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:00.503477 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.503446 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt"] Apr 20 21:55:00.507054 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:55:00.507028 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode80eab89_a43e_4d96_b8be_1bc48afb35f6.slice/crio-3fe525178b40939bfc49d936bbf3db80d5495c1707e6ebffb45268336ce8eb23 WatchSource:0}: Error finding container 3fe525178b40939bfc49d936bbf3db80d5495c1707e6ebffb45268336ce8eb23: Status 404 returned error can't find the container with id 3fe525178b40939bfc49d936bbf3db80d5495c1707e6ebffb45268336ce8eb23 Apr 20 21:55:00.734752 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:00.734672 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" event={"ID":"e80eab89-a43e-4d96-b8be-1bc48afb35f6","Type":"ContainerStarted","Data":"3fe525178b40939bfc49d936bbf3db80d5495c1707e6ebffb45268336ce8eb23"} Apr 20 21:55:03.745732 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:03.745697 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" event={"ID":"e80eab89-a43e-4d96-b8be-1bc48afb35f6","Type":"ContainerStarted","Data":"0d8d5ad6179567e6b7b3f18361e4301c24ab7f1f29831ab648a88c44b2e24212"} Apr 20 21:55:03.746109 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:03.745809 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" Apr 20 21:55:03.763448 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:03.763402 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt" podStartSLOduration=1.263459157 podStartE2EDuration="3.763389511s" podCreationTimestamp="2026-04-20 21:55:00 +0000 UTC" firstStartedPulling="2026-04-20 21:55:00.508626879 +0000 UTC m=+470.669480997" lastFinishedPulling="2026-04-20 21:55:03.00855722 +0000 UTC m=+473.169411351" observedRunningTime="2026-04-20 21:55:03.762440349 +0000 UTC m=+473.923294491" watchObservedRunningTime="2026-04-20 21:55:03.763389511 +0000 UTC m=+473.924243643" Apr 20 21:55:05.991781 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:05.991749 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"] Apr 20 21:55:05.994898 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:05.994882 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:05.998856 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:05.998833 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 21:55:05.999092 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:05.999077 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-jdmz7\""
Apr 20 21:55:05.999365 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:05.999352 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 21:55:06.000068 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.000049 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 21:55:06.007841 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.007822 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 21:55:06.007924 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.007856 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 21:55:06.016521 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.016497 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"]
Apr 20 21:55:06.076023 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.075989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96507bea-c4f3-42ae-8e23-77ed0ddd303b-cert\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.076192 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.076036 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzlwm\" (UniqueName: \"kubernetes.io/projected/96507bea-c4f3-42ae-8e23-77ed0ddd303b-kube-api-access-vzlwm\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.076192 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.076067 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/96507bea-c4f3-42ae-8e23-77ed0ddd303b-manager-config\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.076192 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.076116 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/96507bea-c4f3-42ae-8e23-77ed0ddd303b-metrics-cert\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.177029 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.176998 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96507bea-c4f3-42ae-8e23-77ed0ddd303b-cert\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.177160 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.177043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzlwm\" (UniqueName: \"kubernetes.io/projected/96507bea-c4f3-42ae-8e23-77ed0ddd303b-kube-api-access-vzlwm\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.177160 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.177082 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/96507bea-c4f3-42ae-8e23-77ed0ddd303b-manager-config\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.177160 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.177100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/96507bea-c4f3-42ae-8e23-77ed0ddd303b-metrics-cert\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.177815 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.177792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/96507bea-c4f3-42ae-8e23-77ed0ddd303b-manager-config\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.179560 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.179539 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/96507bea-c4f3-42ae-8e23-77ed0ddd303b-cert\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.179641 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.179574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/96507bea-c4f3-42ae-8e23-77ed0ddd303b-metrics-cert\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.184910 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.184886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzlwm\" (UniqueName: \"kubernetes.io/projected/96507bea-c4f3-42ae-8e23-77ed0ddd303b-kube-api-access-vzlwm\") pod \"lws-controller-manager-796667c6c8-g6npc\" (UID: \"96507bea-c4f3-42ae-8e23-77ed0ddd303b\") " pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.303792 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.303710 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:06.424061 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.424023 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"]
Apr 20 21:55:06.428518 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:55:06.428487 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96507bea_c4f3_42ae_8e23_77ed0ddd303b.slice/crio-0f35981e8cfa9a66372fee5d9a858247c061b5b9677c23fe6a60da7e0dc07a29 WatchSource:0}: Error finding container 0f35981e8cfa9a66372fee5d9a858247c061b5b9677c23fe6a60da7e0dc07a29: Status 404 returned error can't find the container with id 0f35981e8cfa9a66372fee5d9a858247c061b5b9677c23fe6a60da7e0dc07a29
Apr 20 21:55:06.753809 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:06.753761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc" event={"ID":"96507bea-c4f3-42ae-8e23-77ed0ddd303b","Type":"ContainerStarted","Data":"0f35981e8cfa9a66372fee5d9a858247c061b5b9677c23fe6a60da7e0dc07a29"}
Apr 20 21:55:09.769539 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:09.769503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc" event={"ID":"96507bea-c4f3-42ae-8e23-77ed0ddd303b","Type":"ContainerStarted","Data":"625d0dbcc8115bceabcb54e4425964854185e7adec1d56517e716063e8cdd369"}
Apr 20 21:55:09.769926 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:09.769715 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:55:09.784295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:09.784249 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc" podStartSLOduration=2.302817901 podStartE2EDuration="4.784236847s" podCreationTimestamp="2026-04-20 21:55:05 +0000 UTC" firstStartedPulling="2026-04-20 21:55:06.430680239 +0000 UTC m=+476.591534357" lastFinishedPulling="2026-04-20 21:55:08.912099185 +0000 UTC m=+479.072953303" observedRunningTime="2026-04-20 21:55:09.783136654 +0000 UTC m=+479.943990794" watchObservedRunningTime="2026-04-20 21:55:09.784236847 +0000 UTC m=+479.945090987"
Apr 20 21:55:14.750838 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:14.750807 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-f5f47469b-pqzwt"
Apr 20 21:55:20.774477 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:55:20.774447 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-796667c6c8-g6npc"
Apr 20 21:56:03.623189 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.623104 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"]
Apr 20 21:56:03.626526 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.626504 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.628532 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.628510 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 21:56:03.628690 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.628669 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 21:56:03.628764 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.628735 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 21:56:03.629093 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.629076 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-mv7w5\""
Apr 20 21:56:03.634429 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.634410 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"]
Apr 20 21:56:03.799294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799257 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/716e293f-d7b5-4987-87c9-9f07afbd37d3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.799294 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799296 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqlgl\" (UniqueName: \"kubernetes.io/projected/716e293f-d7b5-4987-87c9-9f07afbd37d3-kube-api-access-tqlgl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.799541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.799541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799343 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.799541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.799541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.799541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.799541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.799751 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.799548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900543 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/716e293f-d7b5-4987-87c9-9f07afbd37d3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900543 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqlgl\" (UniqueName: \"kubernetes.io/projected/716e293f-d7b5-4987-87c9-9f07afbd37d3-kube-api-access-tqlgl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900543 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900806 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900549 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900806 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900567 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900806 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900806 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900806 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.900806 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.900769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.901148 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.901025 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.901209 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.901181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.901310 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.901284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/716e293f-d7b5-4987-87c9-9f07afbd37d3-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.901310 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.901295 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.901498 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.901406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.903416 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.903398 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.903781 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.903759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.909703 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.909678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqlgl\" (UniqueName: \"kubernetes.io/projected/716e293f-d7b5-4987-87c9-9f07afbd37d3-kube-api-access-tqlgl\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.909810 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.909795 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/716e293f-d7b5-4987-87c9-9f07afbd37d3-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h\" (UID: \"716e293f-d7b5-4987-87c9-9f07afbd37d3\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:03.937890 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:03.937867 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:04.055010 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:04.054979 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"]
Apr 20 21:56:04.058635 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:56:04.058606 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod716e293f_d7b5_4987_87c9_9f07afbd37d3.slice/crio-0ba09382558cd1d7bafbf1f4d7e492cebae6981e23b730897e10cb4ab5c13103 WatchSource:0}: Error finding container 0ba09382558cd1d7bafbf1f4d7e492cebae6981e23b730897e10cb4ab5c13103: Status 404 returned error can't find the container with id 0ba09382558cd1d7bafbf1f4d7e492cebae6981e23b730897e10cb4ab5c13103
Apr 20 21:56:04.914759 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:04.914725 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h" event={"ID":"716e293f-d7b5-4987-87c9-9f07afbd37d3","Type":"ContainerStarted","Data":"0ba09382558cd1d7bafbf1f4d7e492cebae6981e23b730897e10cb4ab5c13103"}
Apr 20 21:56:06.911461 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:06.911413 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 21:56:06.911746 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:06.911492 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 21:56:06.911746 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:06.911521 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 20 21:56:07.924683 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:07.924649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h" event={"ID":"716e293f-d7b5-4987-87c9-9f07afbd37d3","Type":"ContainerStarted","Data":"3fec4a10cf066cce0c161b9a1d87d4cc31adff40f52cd0d54de50a41eabe4d49"}
Apr 20 21:56:07.938460 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:07.938435 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:07.943260 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:07.943238 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:07.944183 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:07.944144 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h" podStartSLOduration=2.09392773 podStartE2EDuration="4.944130184s" podCreationTimestamp="2026-04-20 21:56:03 +0000 UTC" firstStartedPulling="2026-04-20 21:56:04.060914102 +0000 UTC m=+534.221768220" lastFinishedPulling="2026-04-20 21:56:06.91111655 +0000 UTC m=+537.071970674" observedRunningTime="2026-04-20 21:56:07.942398396 +0000 UTC m=+538.103252527" watchObservedRunningTime="2026-04-20 21:56:07.944130184 +0000 UTC m=+538.104984401"
Apr 20 21:56:08.927327 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:08.927294 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:08.928471 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:08.928453 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h"
Apr 20 21:56:18.308068 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.308035 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-gxqh9"]
Apr 20 21:56:18.311363 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.311338 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9"
Apr 20 21:56:18.314181 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.314153 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-64hdq\""
Apr 20 21:56:18.314181 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.314167 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 21:56:18.317820 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.317796 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 21:56:18.321047 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.321024 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-gxqh9"]
Apr 20 21:56:18.402116 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.402081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nblt\" (UniqueName: \"kubernetes.io/projected/587ea1b9-59f3-4a6c-9776-a43975013fa2-kube-api-access-2nblt\") pod \"kuadrant-operator-catalog-gxqh9\" (UID: \"587ea1b9-59f3-4a6c-9776-a43975013fa2\") " pod="kuadrant-system/kuadrant-operator-catalog-gxqh9"
Apr 20 21:56:18.503390 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.503336 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nblt\" (UniqueName: \"kubernetes.io/projected/587ea1b9-59f3-4a6c-9776-a43975013fa2-kube-api-access-2nblt\") pod \"kuadrant-operator-catalog-gxqh9\" (UID: \"587ea1b9-59f3-4a6c-9776-a43975013fa2\") " pod="kuadrant-system/kuadrant-operator-catalog-gxqh9"
Apr 20 21:56:18.510520 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.510489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nblt\" (UniqueName: \"kubernetes.io/projected/587ea1b9-59f3-4a6c-9776-a43975013fa2-kube-api-access-2nblt\") pod \"kuadrant-operator-catalog-gxqh9\" (UID: \"587ea1b9-59f3-4a6c-9776-a43975013fa2\") " pod="kuadrant-system/kuadrant-operator-catalog-gxqh9"
Apr 20 21:56:18.628505 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.628480 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9"
Apr 20 21:56:18.676590 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.676528 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-gxqh9"]
Apr 20 21:56:18.747395 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.747340 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-gxqh9"]
Apr 20 21:56:18.750664 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:56:18.750640 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod587ea1b9_59f3_4a6c_9776_a43975013fa2.slice/crio-9ad10e3986a1e1563ee263e1e6b1d6ad5564b8b75a0886771b0db3708f23a57c WatchSource:0}: Error finding container 9ad10e3986a1e1563ee263e1e6b1d6ad5564b8b75a0886771b0db3708f23a57c: Status 404 returned error can't find the container with id 9ad10e3986a1e1563ee263e1e6b1d6ad5564b8b75a0886771b0db3708f23a57c
Apr 20 21:56:18.884960 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.884885 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-6kq45"]
Apr 20 21:56:18.889328 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.889307 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:18.897487 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.897464 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-6kq45"]
Apr 20 21:56:18.906797 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.906768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n8wc\" (UniqueName: \"kubernetes.io/projected/f855ab67-bf36-4e85-9116-8a11da0acd4c-kube-api-access-9n8wc\") pod \"kuadrant-operator-catalog-6kq45\" (UID: \"f855ab67-bf36-4e85-9116-8a11da0acd4c\") " pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:18.954502 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:18.954473 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9" event={"ID":"587ea1b9-59f3-4a6c-9776-a43975013fa2","Type":"ContainerStarted","Data":"9ad10e3986a1e1563ee263e1e6b1d6ad5564b8b75a0886771b0db3708f23a57c"}
Apr 20 21:56:19.007906 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:19.007877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n8wc\" (UniqueName: \"kubernetes.io/projected/f855ab67-bf36-4e85-9116-8a11da0acd4c-kube-api-access-9n8wc\") pod \"kuadrant-operator-catalog-6kq45\" (UID: \"f855ab67-bf36-4e85-9116-8a11da0acd4c\") " pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:19.016004 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:19.015979 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n8wc\" (UniqueName: \"kubernetes.io/projected/f855ab67-bf36-4e85-9116-8a11da0acd4c-kube-api-access-9n8wc\") pod \"kuadrant-operator-catalog-6kq45\" (UID: \"f855ab67-bf36-4e85-9116-8a11da0acd4c\") " pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:19.199110 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:19.199030 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:19.315601 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:19.315557 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-6kq45"]
Apr 20 21:56:19.319847 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:56:19.319818 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf855ab67_bf36_4e85_9116_8a11da0acd4c.slice/crio-6a83f1e5918f4ea8ea22a2bfc959471b24085c3093c1cd742da361b0c98489b2 WatchSource:0}: Error finding container 6a83f1e5918f4ea8ea22a2bfc959471b24085c3093c1cd742da361b0c98489b2: Status 404 returned error can't find the container with id 6a83f1e5918f4ea8ea22a2bfc959471b24085c3093c1cd742da361b0c98489b2
Apr 20 21:56:19.958538 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:19.958505 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-6kq45" event={"ID":"f855ab67-bf36-4e85-9116-8a11da0acd4c","Type":"ContainerStarted","Data":"6a83f1e5918f4ea8ea22a2bfc959471b24085c3093c1cd742da361b0c98489b2"}
Apr 20 21:56:20.962740 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:20.962693 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9" event={"ID":"587ea1b9-59f3-4a6c-9776-a43975013fa2","Type":"ContainerStarted","Data":"7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e"}
Apr 20 21:56:20.963210 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:20.962788 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9" podUID="587ea1b9-59f3-4a6c-9776-a43975013fa2" containerName="registry-server" containerID="cri-o://7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e" gracePeriod=2
Apr 20 21:56:20.964017 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:20.963992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-6kq45" event={"ID":"f855ab67-bf36-4e85-9116-8a11da0acd4c","Type":"ContainerStarted","Data":"66a9ec9dd2b594817fcf31bcdfebb679192626c9ef1428fbd65ae8f350fc1d88"}
Apr 20 21:56:20.977684 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:20.977647 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9" podStartSLOduration=0.929671943 podStartE2EDuration="2.97763528s" podCreationTimestamp="2026-04-20 21:56:18 +0000 UTC" firstStartedPulling="2026-04-20 21:56:18.751931152 +0000 UTC m=+548.912785270" lastFinishedPulling="2026-04-20 21:56:20.799894486 +0000 UTC m=+550.960748607" observedRunningTime="2026-04-20 21:56:20.97689762 +0000 UTC m=+551.137751759" watchObservedRunningTime="2026-04-20 21:56:20.97763528 +0000 UTC m=+551.138489420"
Apr 20 21:56:20.993006 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:20.992962 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-6kq45" podStartSLOduration=1.511484191 podStartE2EDuration="2.992950425s" podCreationTimestamp="2026-04-20 21:56:18 +0000 UTC" firstStartedPulling="2026-04-20 21:56:19.321136372 +0000 UTC m=+549.481990490" lastFinishedPulling="2026-04-20 21:56:20.802602586 +0000 UTC m=+550.963456724" observedRunningTime="2026-04-20 21:56:20.991650926 +0000 UTC m=+551.152505065" watchObservedRunningTime="2026-04-20 21:56:20.992950425 +0000 UTC m=+551.153804565"
Apr 20 21:56:21.197424 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.197403 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9"
Apr 20 21:56:21.227754 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.227724 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nblt\" (UniqueName: \"kubernetes.io/projected/587ea1b9-59f3-4a6c-9776-a43975013fa2-kube-api-access-2nblt\") pod \"587ea1b9-59f3-4a6c-9776-a43975013fa2\" (UID: \"587ea1b9-59f3-4a6c-9776-a43975013fa2\") "
Apr 20 21:56:21.229800 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.229770 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587ea1b9-59f3-4a6c-9776-a43975013fa2-kube-api-access-2nblt" (OuterVolumeSpecName: "kube-api-access-2nblt") pod "587ea1b9-59f3-4a6c-9776-a43975013fa2" (UID: "587ea1b9-59f3-4a6c-9776-a43975013fa2"). InnerVolumeSpecName "kube-api-access-2nblt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:56:21.328602 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.328526 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nblt\" (UniqueName: \"kubernetes.io/projected/587ea1b9-59f3-4a6c-9776-a43975013fa2-kube-api-access-2nblt\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\""
Apr 20 21:56:21.967864 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.967825 2574 generic.go:358] "Generic (PLEG): container finished" podID="587ea1b9-59f3-4a6c-9776-a43975013fa2" containerID="7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e" exitCode=0
Apr 20 21:56:21.968242 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.967881 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9"
Apr 20 21:56:21.968242 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.967910 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9" event={"ID":"587ea1b9-59f3-4a6c-9776-a43975013fa2","Type":"ContainerDied","Data":"7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e"}
Apr 20 21:56:21.968242 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.967946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-gxqh9" event={"ID":"587ea1b9-59f3-4a6c-9776-a43975013fa2","Type":"ContainerDied","Data":"9ad10e3986a1e1563ee263e1e6b1d6ad5564b8b75a0886771b0db3708f23a57c"}
Apr 20 21:56:21.968242 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.967963 2574 scope.go:117] "RemoveContainer" containerID="7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e"
Apr 20 21:56:21.976528 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.976513 2574 scope.go:117] "RemoveContainer" containerID="7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e"
Apr 20 21:56:21.976770 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:56:21.976756 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e\": container with ID starting with 7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e not found: ID does not exist" containerID="7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e"
Apr 20 21:56:21.976808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.976778 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e"} err="failed to get container status \"7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e\": rpc error: code = NotFound desc = could not find container \"7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e\": container with ID starting with 7ec408a9a23b4e477e623f3740ded22b5ae0970c7eefaa69a40614443074378e not found: ID does not exist"
Apr 20 21:56:21.987022 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.986996 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-gxqh9"]
Apr 20 21:56:21.991889 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:21.991871 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-gxqh9"]
Apr 20 21:56:22.434419 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:22.434366 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587ea1b9-59f3-4a6c-9776-a43975013fa2" path="/var/lib/kubelet/pods/587ea1b9-59f3-4a6c-9776-a43975013fa2/volumes"
Apr 20 21:56:29.200190 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:29.200154 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:29.200190 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:29.200198 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:29.221274 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:29.221248 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:30.011458 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:30.011431 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-6kq45"
Apr 20 21:56:48.958956 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:48.958920 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"]
Apr 20 21:56:48.959511 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:48.959168 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587ea1b9-59f3-4a6c-9776-a43975013fa2" containerName="registry-server"
Apr 20 21:56:48.959511 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:48.959179 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="587ea1b9-59f3-4a6c-9776-a43975013fa2" containerName="registry-server"
Apr 20 21:56:48.959511 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:48.959231 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="587ea1b9-59f3-4a6c-9776-a43975013fa2" containerName="registry-server"
Apr 20 21:56:48.968464 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:48.968442 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:48.970792 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:48.970773 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-htr2p\""
Apr 20 21:56:48.977271 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:48.977246 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"]
Apr 20 21:56:49.026397 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:49.026340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fd490b9a-98e4-47fd-bb76-888c85f17749-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" (UID: \"fd490b9a-98e4-47fd-bb76-888c85f17749\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:49.026526 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:49.026439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzq66\" (UniqueName: \"kubernetes.io/projected/fd490b9a-98e4-47fd-bb76-888c85f17749-kube-api-access-zzq66\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" (UID: \"fd490b9a-98e4-47fd-bb76-888c85f17749\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:49.126745 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:49.126714 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzq66\" (UniqueName: \"kubernetes.io/projected/fd490b9a-98e4-47fd-bb76-888c85f17749-kube-api-access-zzq66\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" (UID: \"fd490b9a-98e4-47fd-bb76-888c85f17749\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:49.126907 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:49.126774 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fd490b9a-98e4-47fd-bb76-888c85f17749-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" (UID: \"fd490b9a-98e4-47fd-bb76-888c85f17749\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:49.127102 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:49.127083 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fd490b9a-98e4-47fd-bb76-888c85f17749-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" (UID: \"fd490b9a-98e4-47fd-bb76-888c85f17749\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:49.136570 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:49.136546 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzq66\" (UniqueName: \"kubernetes.io/projected/fd490b9a-98e4-47fd-bb76-888c85f17749-kube-api-access-zzq66\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" (UID: \"fd490b9a-98e4-47fd-bb76-888c85f17749\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:49.278782 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:49.278681 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:49.398979 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:49.398948 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"]
Apr 20 21:56:49.402478 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:56:49.402451 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd490b9a_98e4_47fd_bb76_888c85f17749.slice/crio-a89f49eb51769ba6bab49d37adeaa6e83b14db7af31a36a68620620b093474b7 WatchSource:0}: Error finding container a89f49eb51769ba6bab49d37adeaa6e83b14db7af31a36a68620620b093474b7: Status 404 returned error can't find the container with id a89f49eb51769ba6bab49d37adeaa6e83b14db7af31a36a68620620b093474b7
Apr 20 21:56:50.050988 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:50.050951 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" event={"ID":"fd490b9a-98e4-47fd-bb76-888c85f17749","Type":"ContainerStarted","Data":"a89f49eb51769ba6bab49d37adeaa6e83b14db7af31a36a68620620b093474b7"}
Apr 20 21:56:53.599488 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:53.599454 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"]
Apr 20 21:56:53.602888 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:53.602869 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"
Apr 20 21:56:53.605175 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:53.605151 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-g7zfp\""
Apr 20 21:56:53.615478 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:53.615452 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"]
Apr 20 21:56:53.661318 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:53.661284 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dff28\" (UniqueName: \"kubernetes.io/projected/c8213a86-228c-4b9a-b5cb-c124dfc85613-kube-api-access-dff28\") pod \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" (UID: \"c8213a86-228c-4b9a-b5cb-c124dfc85613\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"
Apr 20 21:56:53.762337 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:53.762295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dff28\" (UniqueName: \"kubernetes.io/projected/c8213a86-228c-4b9a-b5cb-c124dfc85613-kube-api-access-dff28\") pod \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" (UID: \"c8213a86-228c-4b9a-b5cb-c124dfc85613\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"
Apr 20 21:56:53.776191 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:53.776163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dff28\" (UniqueName: \"kubernetes.io/projected/c8213a86-228c-4b9a-b5cb-c124dfc85613-kube-api-access-dff28\") pod \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" (UID: \"c8213a86-228c-4b9a-b5cb-c124dfc85613\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"
Apr 20 21:56:53.916212 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:53.916188 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"
Apr 20 21:56:54.031574 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:54.031545 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"]
Apr 20 21:56:54.034653 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:56:54.034623 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8213a86_228c_4b9a_b5cb_c124dfc85613.slice/crio-74f80967bdcb9b6b960d1470d0a60505d3c9439cfa9872283ce78f035b754c4a WatchSource:0}: Error finding container 74f80967bdcb9b6b960d1470d0a60505d3c9439cfa9872283ce78f035b754c4a: Status 404 returned error can't find the container with id 74f80967bdcb9b6b960d1470d0a60505d3c9439cfa9872283ce78f035b754c4a
Apr 20 21:56:54.064208 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:54.064177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" event={"ID":"fd490b9a-98e4-47fd-bb76-888c85f17749","Type":"ContainerStarted","Data":"f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3"}
Apr 20 21:56:54.064323 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:54.064255 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:56:54.065251 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:54.065231 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" event={"ID":"c8213a86-228c-4b9a-b5cb-c124dfc85613","Type":"ContainerStarted","Data":"74f80967bdcb9b6b960d1470d0a60505d3c9439cfa9872283ce78f035b754c4a"}
Apr 20 21:56:54.080758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:54.080721 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" podStartSLOduration=1.6266050239999998 podStartE2EDuration="6.080709679s" podCreationTimestamp="2026-04-20 21:56:48 +0000 UTC" firstStartedPulling="2026-04-20 21:56:49.404668577 +0000 UTC m=+579.565522696" lastFinishedPulling="2026-04-20 21:56:53.85877322 +0000 UTC m=+584.019627351" observedRunningTime="2026-04-20 21:56:54.080400229 +0000 UTC m=+584.241254367" watchObservedRunningTime="2026-04-20 21:56:54.080709679 +0000 UTC m=+584.241563850"
Apr 20 21:56:56.072336 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:56.072260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" event={"ID":"c8213a86-228c-4b9a-b5cb-c124dfc85613","Type":"ContainerStarted","Data":"fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab"}
Apr 20 21:56:56.072731 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:56.072412 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"
Apr 20 21:56:56.094668 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:56:56.094620 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" podStartSLOduration=1.443883454 podStartE2EDuration="3.094604552s" podCreationTimestamp="2026-04-20 21:56:53 +0000 UTC" firstStartedPulling="2026-04-20 21:56:54.036338476 +0000 UTC m=+584.197192594" lastFinishedPulling="2026-04-20 21:56:55.687059574 +0000 UTC m=+585.847913692" observedRunningTime="2026-04-20 21:56:56.093975694 +0000 UTC m=+586.254829834" watchObservedRunningTime="2026-04-20 21:56:56.094604552 +0000 UTC m=+586.255458694"
Apr 20 21:57:05.070448 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:05.070416 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"
Apr 20 21:57:06.756158 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.756116 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"]
Apr 20 21:57:06.756612 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.756337 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" containerName="manager" containerID="cri-o://f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3" gracePeriod=2
Apr 20 21:57:06.764510 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.764263 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4"]
Apr 20 21:57:06.779324 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.779299 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"]
Apr 20 21:57:06.779588 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.779560 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" containerName="manager" containerID="cri-o://fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab" gracePeriod=2
Apr 20 21:57:06.781162 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.781134 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object"
Apr 20 21:57:06.781604 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.781584 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"
Apr 20 21:57:06.785354 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.785323 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"]
Apr 20 21:57:06.785790 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.785768 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" containerName="manager"
Apr 20 21:57:06.785790 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.785790 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" containerName="manager"
Apr 20 21:57:06.785897 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.785842 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" containerName="manager"
Apr 20 21:57:06.788414 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.788382 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object"
Apr 20 21:57:06.788570 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.788556 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"
Apr 20 21:57:06.793265 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.793215 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq"]
Apr 20 21:57:06.803090 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.803068 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"]
Apr 20 21:57:06.806526 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.806505 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94"]
Apr 20 21:57:06.806923 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.806904 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" containerName="manager"
Apr 20 21:57:06.806923 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.806924 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" containerName="manager"
Apr 20 21:57:06.807051 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.806986 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" containerName="manager"
Apr 20 21:57:06.809580 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.809565 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94"
Apr 20 21:57:06.825336 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.825301 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object"
Apr 20 21:57:06.825541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.825466 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94"]
Apr 20 21:57:06.826740 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.826708 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object"
Apr 20 21:57:06.828158 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.828135 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object"
Apr 20 21:57:06.829859 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.829832 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object"
Apr 20 21:57:06.863475 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.863434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/da2f105a-93ca-4859-a22c-e9a5bce5395d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mvf5f\" (UID: \"da2f105a-93ca-4859-a22c-e9a5bce5395d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"
Apr 20 21:57:06.863475 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.863469 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h66v\" (UniqueName: \"kubernetes.io/projected/29b22937-44fb-43ee-8c4d-ceb0a606443c-kube-api-access-6h66v\") pod \"limitador-operator-controller-manager-85c4996f8c-2hn94\" (UID: \"29b22937-44fb-43ee-8c4d-ceb0a606443c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94"
Apr 20 21:57:06.863639 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.863564 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf4k2\" (UniqueName: \"kubernetes.io/projected/da2f105a-93ca-4859-a22c-e9a5bce5395d-kube-api-access-nf4k2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mvf5f\" (UID: \"da2f105a-93ca-4859-a22c-e9a5bce5395d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"
Apr 20 21:57:06.964302 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.964266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf4k2\" (UniqueName: \"kubernetes.io/projected/da2f105a-93ca-4859-a22c-e9a5bce5395d-kube-api-access-nf4k2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mvf5f\" (UID: \"da2f105a-93ca-4859-a22c-e9a5bce5395d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"
Apr 20 21:57:06.964475 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.964362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/da2f105a-93ca-4859-a22c-e9a5bce5395d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mvf5f\" (UID: \"da2f105a-93ca-4859-a22c-e9a5bce5395d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"
Apr 20 21:57:06.964475 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.964427 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6h66v\" (UniqueName: \"kubernetes.io/projected/29b22937-44fb-43ee-8c4d-ceb0a606443c-kube-api-access-6h66v\") pod \"limitador-operator-controller-manager-85c4996f8c-2hn94\" (UID: \"29b22937-44fb-43ee-8c4d-ceb0a606443c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94"
Apr 20 21:57:06.964757 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.964733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/da2f105a-93ca-4859-a22c-e9a5bce5395d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mvf5f\" (UID: \"da2f105a-93ca-4859-a22c-e9a5bce5395d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"
Apr 20 21:57:06.975295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.975265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h66v\" (UniqueName: \"kubernetes.io/projected/29b22937-44fb-43ee-8c4d-ceb0a606443c-kube-api-access-6h66v\") pod \"limitador-operator-controller-manager-85c4996f8c-2hn94\" (UID: \"29b22937-44fb-43ee-8c4d-ceb0a606443c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94"
Apr 20 21:57:06.978280 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.978254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf4k2\" (UniqueName: \"kubernetes.io/projected/da2f105a-93ca-4859-a22c-e9a5bce5395d-kube-api-access-nf4k2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mvf5f\" (UID: \"da2f105a-93ca-4859-a22c-e9a5bce5395d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"
Apr 20 21:57:06.997060 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.997039 2574 util.go:48] "No ready sandbox for
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" Apr 20 21:57:06.998888 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:06.998856 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.000144 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.000115 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.004420 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.004404 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" Apr 20 21:57:07.005789 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.005769 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.006973 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.006931 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.065217 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.065188 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzq66\" (UniqueName: \"kubernetes.io/projected/fd490b9a-98e4-47fd-bb76-888c85f17749-kube-api-access-zzq66\") pod \"fd490b9a-98e4-47fd-bb76-888c85f17749\" (UID: \"fd490b9a-98e4-47fd-bb76-888c85f17749\") " Apr 20 21:57:07.065217 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.065224 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dff28\" (UniqueName: \"kubernetes.io/projected/c8213a86-228c-4b9a-b5cb-c124dfc85613-kube-api-access-dff28\") pod \"c8213a86-228c-4b9a-b5cb-c124dfc85613\" (UID: \"c8213a86-228c-4b9a-b5cb-c124dfc85613\") " Apr 20 21:57:07.065438 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.065273 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fd490b9a-98e4-47fd-bb76-888c85f17749-extensions-socket-volume\") pod \"fd490b9a-98e4-47fd-bb76-888c85f17749\" (UID: \"fd490b9a-98e4-47fd-bb76-888c85f17749\") " Apr 20 21:57:07.065848 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.065820 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd490b9a-98e4-47fd-bb76-888c85f17749-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "fd490b9a-98e4-47fd-bb76-888c85f17749" (UID: "fd490b9a-98e4-47fd-bb76-888c85f17749"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:57:07.067305 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.067286 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd490b9a-98e4-47fd-bb76-888c85f17749-kube-api-access-zzq66" (OuterVolumeSpecName: "kube-api-access-zzq66") pod "fd490b9a-98e4-47fd-bb76-888c85f17749" (UID: "fd490b9a-98e4-47fd-bb76-888c85f17749"). InnerVolumeSpecName "kube-api-access-zzq66". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:57:07.067410 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.067317 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8213a86-228c-4b9a-b5cb-c124dfc85613-kube-api-access-dff28" (OuterVolumeSpecName: "kube-api-access-dff28") pod "c8213a86-228c-4b9a-b5cb-c124dfc85613" (UID: "c8213a86-228c-4b9a-b5cb-c124dfc85613"). InnerVolumeSpecName "kube-api-access-dff28". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:57:07.105016 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.104981 2574 generic.go:358] "Generic (PLEG): container finished" podID="fd490b9a-98e4-47fd-bb76-888c85f17749" containerID="f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3" exitCode=0 Apr 20 21:57:07.105134 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.105033 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" Apr 20 21:57:07.105134 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.105065 2574 scope.go:117] "RemoveContainer" containerID="f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3" Apr 20 21:57:07.106233 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.106210 2574 generic.go:358] "Generic (PLEG): container finished" podID="c8213a86-228c-4b9a-b5cb-c124dfc85613" containerID="fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab" exitCode=0 Apr 20 21:57:07.106310 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.106260 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" Apr 20 21:57:07.107171 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.107147 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.108706 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.108669 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.110153 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.110127 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.111507 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.111482 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.113214 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.113193 2574 scope.go:117] "RemoveContainer" containerID="f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3" Apr 20 21:57:07.113498 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:57:07.113474 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3\": container with ID starting with f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3 not found: ID does not exist" containerID="f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3" Apr 20 21:57:07.113580 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.113501 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3"} err="failed to get container status \"f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3\": rpc error: code = NotFound desc = could not find container \"f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3\": container with ID starting 
with f720b58c5cb44a35c63ff0da68dc4c80b33cd736637c4b5456d69bd037acabf3 not found: ID does not exist" Apr 20 21:57:07.113580 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.113522 2574 scope.go:117] "RemoveContainer" containerID="fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab" Apr 20 21:57:07.118063 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.118032 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.119553 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.119531 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.120518 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.120504 2574 scope.go:117] "RemoveContainer" containerID="fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab" Apr 20 21:57:07.120743 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:57:07.120723 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab\": container with ID starting with fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab not found: ID does not exist" containerID="fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab" Apr 20 21:57:07.120784 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.120749 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab"} err="failed to get container status \"fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab\": rpc error: code = NotFound desc = could not find container \"fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab\": container with ID starting with fb46f5944f5f8b19a96193265fa0ba2ceac354e2c123b54a28c9053dd2b415ab not found: ID does not exist" Apr 20 21:57:07.120877 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.120861 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.122091 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.122075 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" 
err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:07.166579 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.166548 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fd490b9a-98e4-47fd-bb76-888c85f17749-extensions-socket-volume\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:57:07.166579 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.166574 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zzq66\" (UniqueName: \"kubernetes.io/projected/fd490b9a-98e4-47fd-bb76-888c85f17749-kube-api-access-zzq66\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:57:07.166733 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.166588 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dff28\" (UniqueName: \"kubernetes.io/projected/c8213a86-228c-4b9a-b5cb-c124dfc85613-kube-api-access-dff28\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:57:07.183477 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.183454 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" Apr 20 21:57:07.189141 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.189126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94" Apr 20 21:57:07.318862 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.318777 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"] Apr 20 21:57:07.322850 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:57:07.322824 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda2f105a_93ca_4859_a22c_e9a5bce5395d.slice/crio-330454de7033f5212291386ff6f6777c4763c17324e5358fc15a7fa9fc05b9f6 WatchSource:0}: Error finding container 330454de7033f5212291386ff6f6777c4763c17324e5358fc15a7fa9fc05b9f6: Status 404 returned error can't find the container with id 330454de7033f5212291386ff6f6777c4763c17324e5358fc15a7fa9fc05b9f6 Apr 20 21:57:07.331394 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:07.331331 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94"] Apr 20 21:57:07.336144 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:57:07.336119 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29b22937_44fb_43ee_8c4d_ceb0a606443c.slice/crio-5016aa8df77417e57b21c074e7c9fe528e1fedbfa19fb7f793f8c63794088d26 WatchSource:0}: Error finding container 5016aa8df77417e57b21c074e7c9fe528e1fedbfa19fb7f793f8c63794088d26: Status 404 returned error can't find the container with id 5016aa8df77417e57b21c074e7c9fe528e1fedbfa19fb7f793f8c63794088d26 Apr 20 21:57:08.111162 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.111123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94" 
event={"ID":"29b22937-44fb-43ee-8c4d-ceb0a606443c","Type":"ContainerStarted","Data":"af9fc7778ad0abd52546111dfd664b55c59cb755a0915854dd02ea3dcde1ec24"} Apr 20 21:57:08.111643 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.111169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94" event={"ID":"29b22937-44fb-43ee-8c4d-ceb0a606443c","Type":"ContainerStarted","Data":"5016aa8df77417e57b21c074e7c9fe528e1fedbfa19fb7f793f8c63794088d26"} Apr 20 21:57:08.111643 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.111207 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94" Apr 20 21:57:08.112907 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.112873 2574 status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:08.113569 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.113543 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" event={"ID":"da2f105a-93ca-4859-a22c-e9a5bce5395d","Type":"ContainerStarted","Data":"ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69"} Apr 20 21:57:08.113667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.113576 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" event={"ID":"da2f105a-93ca-4859-a22c-e9a5bce5395d","Type":"ContainerStarted","Data":"330454de7033f5212291386ff6f6777c4763c17324e5358fc15a7fa9fc05b9f6"} Apr 20 21:57:08.113667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.113618 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" Apr 20 21:57:08.114411 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.114388 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:08.130274 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.130228 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94" podStartSLOduration=2.13021585 podStartE2EDuration="2.13021585s" podCreationTimestamp="2026-04-20 21:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:57:08.129178357 +0000 UTC m=+598.290032497" watchObservedRunningTime="2026-04-20 21:57:08.13021585 +0000 UTC m=+598.291069990" Apr 20 21:57:08.149762 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.149721 2574 
status_manager.go:895] "Failed to get status for pod" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-dn8c4" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-dn8c4\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:08.150218 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.150175 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" podStartSLOduration=2.150143332 podStartE2EDuration="2.150143332s" podCreationTimestamp="2026-04-20 21:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:57:08.148069639 +0000 UTC m=+598.308923779" watchObservedRunningTime="2026-04-20 21:57:08.150143332 +0000 UTC m=+598.310997473" Apr 20 21:57:08.151041 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.151020 2574 status_manager.go:895] "Failed to get status for pod" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-4z4xq" err="pods \"limitador-operator-controller-manager-85c4996f8c-4z4xq\" is forbidden: User \"system:node:ip-10-0-140-110.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-110.ec2.internal' and this object" Apr 20 21:57:08.431207 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.431132 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8213a86-228c-4b9a-b5cb-c124dfc85613" path="/var/lib/kubelet/pods/c8213a86-228c-4b9a-b5cb-c124dfc85613/volumes" Apr 20 21:57:08.431482 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:08.431468 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd490b9a-98e4-47fd-bb76-888c85f17749" path="/var/lib/kubelet/pods/fd490b9a-98e4-47fd-bb76-888c85f17749/volumes" Apr 20 21:57:10.341684 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:10.341659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 21:57:10.342348 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:10.342331 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 21:57:19.119185 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:19.119154 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hn94" Apr 20 21:57:19.119598 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:19.119210 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" Apr 20 21:57:23.885272 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:23.885238 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"] Apr 20 21:57:23.885736 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:23.885543 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" podUID="da2f105a-93ca-4859-a22c-e9a5bce5395d" containerName="manager" containerID="cri-o://ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69" gracePeriod=10 Apr 20 21:57:24.132014 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.131990 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" Apr 20 21:57:24.175106 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.175019 2574 generic.go:358] "Generic (PLEG): container finished" podID="da2f105a-93ca-4859-a22c-e9a5bce5395d" containerID="ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69" exitCode=0 Apr 20 21:57:24.175106 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.175094 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" event={"ID":"da2f105a-93ca-4859-a22c-e9a5bce5395d","Type":"ContainerDied","Data":"ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69"} Apr 20 21:57:24.175324 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.175104 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" Apr 20 21:57:24.175324 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.175130 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f" event={"ID":"da2f105a-93ca-4859-a22c-e9a5bce5395d","Type":"ContainerDied","Data":"330454de7033f5212291386ff6f6777c4763c17324e5358fc15a7fa9fc05b9f6"} Apr 20 21:57:24.175324 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.175149 2574 scope.go:117] "RemoveContainer" containerID="ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69" Apr 20 21:57:24.183106 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.183090 2574 scope.go:117] "RemoveContainer" containerID="ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69" Apr 20 21:57:24.183336 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:57:24.183319 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69\": container with ID starting with ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69 not found: ID does not exist" containerID="ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69" Apr 20 21:57:24.183407 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.183343 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69"} err="failed to get container status \"ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69\": rpc error: code = NotFound desc = could not find container \"ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69\": container with ID starting with ba309bc5c459f8de420c037ef7af0efced35f8efc0f50e5b3195e5be97969e69 not found: ID does not exist" Apr 20 21:57:24.295153 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.295112 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/da2f105a-93ca-4859-a22c-e9a5bce5395d-extensions-socket-volume\") pod 
\"da2f105a-93ca-4859-a22c-e9a5bce5395d\" (UID: \"da2f105a-93ca-4859-a22c-e9a5bce5395d\") " Apr 20 21:57:24.295331 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.295191 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf4k2\" (UniqueName: \"kubernetes.io/projected/da2f105a-93ca-4859-a22c-e9a5bce5395d-kube-api-access-nf4k2\") pod \"da2f105a-93ca-4859-a22c-e9a5bce5395d\" (UID: \"da2f105a-93ca-4859-a22c-e9a5bce5395d\") " Apr 20 21:57:24.295573 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.295546 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da2f105a-93ca-4859-a22c-e9a5bce5395d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "da2f105a-93ca-4859-a22c-e9a5bce5395d" (UID: "da2f105a-93ca-4859-a22c-e9a5bce5395d"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:57:24.297226 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.297200 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2f105a-93ca-4859-a22c-e9a5bce5395d-kube-api-access-nf4k2" (OuterVolumeSpecName: "kube-api-access-nf4k2") pod "da2f105a-93ca-4859-a22c-e9a5bce5395d" (UID: "da2f105a-93ca-4859-a22c-e9a5bce5395d"). InnerVolumeSpecName "kube-api-access-nf4k2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:57:24.395718 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.395688 2574 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/da2f105a-93ca-4859-a22c-e9a5bce5395d-extensions-socket-volume\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:57:24.395718 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.395718 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nf4k2\" (UniqueName: \"kubernetes.io/projected/da2f105a-93ca-4859-a22c-e9a5bce5395d-kube-api-access-nf4k2\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:57:24.491908 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.491882 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"] Apr 20 21:57:24.498073 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:24.498050 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mvf5f"] Apr 20 21:57:26.431432 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:26.431400 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2f105a-93ca-4859-a22c-e9a5bce5395d" path="/var/lib/kubelet/pods/da2f105a-93ca-4859-a22c-e9a5bce5395d/volumes" Apr 20 21:57:40.135666 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.135631 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64"] Apr 20 21:57:40.136120 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.136014 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da2f105a-93ca-4859-a22c-e9a5bce5395d" containerName="manager" Apr 20 21:57:40.136120 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.136031 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2f105a-93ca-4859-a22c-e9a5bce5395d" containerName="manager" Apr 20 21:57:40.136120 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.136091 2574 
memory_manager.go:356] "RemoveStaleState removing state" podUID="da2f105a-93ca-4859-a22c-e9a5bce5395d" containerName="manager" Apr 20 21:57:40.139021 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.138995 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.141529 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.141219 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-9ql96\"" Apr 20 21:57:40.152731 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.152710 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64"] Apr 20 21:57:40.317141 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.317141 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317145 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.317411 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b3c675b-1eb0-4857-8212-055e3a3de56b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.317411 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.317411 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.317411 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317297 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-token\") pod 
\"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.317411 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.317632 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317416 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj5zc\" (UniqueName: \"kubernetes.io/projected/2b3c675b-1eb0-4857-8212-055e3a3de56b-kube-api-access-zj5zc\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.317632 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.317487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.418676 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418593 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zj5zc\" (UniqueName: \"kubernetes.io/projected/2b3c675b-1eb0-4857-8212-055e3a3de56b-kube-api-access-zj5zc\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.418676 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.418676 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.418929 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 
21:57:40.418929 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b3c675b-1eb0-4857-8212-055e3a3de56b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.418929 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.418929 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.418929 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.418929 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.418827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.419224 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.419099 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.419224 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.419175 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.419335 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.419243 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: 
\"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.419335 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.419274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.419738 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.419715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b3c675b-1eb0-4857-8212-055e3a3de56b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.421049 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.421023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.421178 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.421159 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.425328 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.425307 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b3c675b-1eb0-4857-8212-055e3a3de56b-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.425512 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.425495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj5zc\" (UniqueName: \"kubernetes.io/projected/2b3c675b-1eb0-4857-8212-055e3a3de56b-kube-api-access-zj5zc\") pod \"maas-default-gateway-openshift-default-58b6f876-8rs64\" (UID: \"2b3c675b-1eb0-4857-8212-055e3a3de56b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.449935 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.449912 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:40.570869 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.570838 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64"] Apr 20 21:57:40.575170 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:57:40.575142 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b3c675b_1eb0_4857_8212_055e3a3de56b.slice/crio-b1af3b2c59f164b9dd76203ad43803d1df86e72fea4a8194f3578bdac14ef400 WatchSource:0}: Error finding container b1af3b2c59f164b9dd76203ad43803d1df86e72fea4a8194f3578bdac14ef400: Status 404 returned error can't find the container with id b1af3b2c59f164b9dd76203ad43803d1df86e72fea4a8194f3578bdac14ef400 Apr 20 21:57:40.577240 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.577205 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 21:57:40.577327 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.577271 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 21:57:40.577327 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:40.577298 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 21:57:41.228480 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:41.228431 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" event={"ID":"2b3c675b-1eb0-4857-8212-055e3a3de56b","Type":"ContainerStarted","Data":"c5b85f1283330c048f72e3abf658af018772891992b63645fd260538f6d3d320"} Apr 20 21:57:41.228480 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:41.228479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" event={"ID":"2b3c675b-1eb0-4857-8212-055e3a3de56b","Type":"ContainerStarted","Data":"b1af3b2c59f164b9dd76203ad43803d1df86e72fea4a8194f3578bdac14ef400"} Apr 20 21:57:41.265382 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:41.265313 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" podStartSLOduration=1.26529842 podStartE2EDuration="1.26529842s" podCreationTimestamp="2026-04-20 21:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:57:41.262699951 +0000 UTC m=+631.423554092" watchObservedRunningTime="2026-04-20 21:57:41.26529842 +0000 UTC m=+631.426152622" Apr 20 21:57:41.450094 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:41.450065 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:41.454490 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:41.454466 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64" Apr 20 21:57:42.231724 ip-10-0-140-110 kubenswrapper[2574]: 
I0420 21:57:42.231691 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64"
Apr 20 21:57:42.232612 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:42.232595 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-8rs64"
Apr 20 21:57:44.441179 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.441144 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m46xv"]
Apr 20 21:57:44.444548 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.444531 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:44.444951 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.444923 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmv8w\" (UniqueName: \"kubernetes.io/projected/70ec7151-d666-4132-aa85-6cd2da29d787-kube-api-access-wmv8w\") pod \"limitador-limitador-7d549b5b-m46xv\" (UID: \"70ec7151-d666-4132-aa85-6cd2da29d787\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:44.445062 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.444966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/70ec7151-d666-4132-aa85-6cd2da29d787-config-file\") pod \"limitador-limitador-7d549b5b-m46xv\" (UID: \"70ec7151-d666-4132-aa85-6cd2da29d787\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:44.446463 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.446438 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pljh2\""
Apr 20 21:57:44.446563 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.446438 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 21:57:44.452709 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.452688 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m46xv"]
Apr 20 21:57:44.543823 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.543791 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m46xv"]
Apr 20 21:57:44.545816 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.545791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmv8w\" (UniqueName: \"kubernetes.io/projected/70ec7151-d666-4132-aa85-6cd2da29d787-kube-api-access-wmv8w\") pod \"limitador-limitador-7d549b5b-m46xv\" (UID: \"70ec7151-d666-4132-aa85-6cd2da29d787\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:44.545943 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.545837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/70ec7151-d666-4132-aa85-6cd2da29d787-config-file\") pod \"limitador-limitador-7d549b5b-m46xv\" (UID: \"70ec7151-d666-4132-aa85-6cd2da29d787\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:44.546540 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.546522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/70ec7151-d666-4132-aa85-6cd2da29d787-config-file\") pod \"limitador-limitador-7d549b5b-m46xv\" (UID: \"70ec7151-d666-4132-aa85-6cd2da29d787\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:44.555825 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.555797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmv8w\" (UniqueName: \"kubernetes.io/projected/70ec7151-d666-4132-aa85-6cd2da29d787-kube-api-access-wmv8w\") pod \"limitador-limitador-7d549b5b-m46xv\" (UID: \"70ec7151-d666-4132-aa85-6cd2da29d787\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:44.755984 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.755908 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:44.870121 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:44.869971 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m46xv"]
Apr 20 21:57:44.872691 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:57:44.872664 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70ec7151_d666_4132_aa85_6cd2da29d787.slice/crio-077f523dcf2fbd9630eab5c9f023935ddf0884db3568c2e9d99d3d09a9470097 WatchSource:0}: Error finding container 077f523dcf2fbd9630eab5c9f023935ddf0884db3568c2e9d99d3d09a9470097: Status 404 returned error can't find the container with id 077f523dcf2fbd9630eab5c9f023935ddf0884db3568c2e9d99d3d09a9470097
Apr 20 21:57:45.242545 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.242507 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv" event={"ID":"70ec7151-d666-4132-aa85-6cd2da29d787","Type":"ContainerStarted","Data":"077f523dcf2fbd9630eab5c9f023935ddf0884db3568c2e9d99d3d09a9470097"}
Apr 20 21:57:45.245442 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.245415 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pgzwv"]
Apr 20 21:57:45.249823 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.249802 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv"
Apr 20 21:57:45.251172 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.251152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgv8w\" (UniqueName: \"kubernetes.io/projected/ba8b73d8-9c70-4740-89ed-11a7d98170a8-kube-api-access-qgv8w\") pod \"authorino-f99f4b5cd-pgzwv\" (UID: \"ba8b73d8-9c70-4740-89ed-11a7d98170a8\") " pod="kuadrant-system/authorino-f99f4b5cd-pgzwv"
Apr 20 21:57:45.251632 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.251616 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-d7ncl\""
Apr 20 21:57:45.254627 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.254606 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pgzwv"]
Apr 20 21:57:45.351764 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.351731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgv8w\" (UniqueName: \"kubernetes.io/projected/ba8b73d8-9c70-4740-89ed-11a7d98170a8-kube-api-access-qgv8w\") pod \"authorino-f99f4b5cd-pgzwv\" (UID: \"ba8b73d8-9c70-4740-89ed-11a7d98170a8\") " pod="kuadrant-system/authorino-f99f4b5cd-pgzwv"
Apr 20 21:57:45.359036 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.359015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgv8w\" (UniqueName: \"kubernetes.io/projected/ba8b73d8-9c70-4740-89ed-11a7d98170a8-kube-api-access-qgv8w\") pod \"authorino-f99f4b5cd-pgzwv\" (UID: \"ba8b73d8-9c70-4740-89ed-11a7d98170a8\") " pod="kuadrant-system/authorino-f99f4b5cd-pgzwv"
Apr 20 21:57:45.436643 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.436614 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-6s5hc"]
Apr 20 21:57:45.439744 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.439722 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6s5hc"
Apr 20 21:57:45.444348 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.444320 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-6s5hc"]
Apr 20 21:57:45.452948 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.452916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpcr7\" (UniqueName: \"kubernetes.io/projected/354f3712-7503-4276-9013-68b1577ca6fb-kube-api-access-rpcr7\") pod \"authorino-7498df8756-6s5hc\" (UID: \"354f3712-7503-4276-9013-68b1577ca6fb\") " pod="kuadrant-system/authorino-7498df8756-6s5hc"
Apr 20 21:57:45.553769 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.553692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpcr7\" (UniqueName: \"kubernetes.io/projected/354f3712-7503-4276-9013-68b1577ca6fb-kube-api-access-rpcr7\") pod \"authorino-7498df8756-6s5hc\" (UID: \"354f3712-7503-4276-9013-68b1577ca6fb\") " pod="kuadrant-system/authorino-7498df8756-6s5hc"
Apr 20 21:57:45.559824 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.559794 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv"
Apr 20 21:57:45.561193 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.561171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpcr7\" (UniqueName: \"kubernetes.io/projected/354f3712-7503-4276-9013-68b1577ca6fb-kube-api-access-rpcr7\") pod \"authorino-7498df8756-6s5hc\" (UID: \"354f3712-7503-4276-9013-68b1577ca6fb\") " pod="kuadrant-system/authorino-7498df8756-6s5hc"
Apr 20 21:57:45.723095 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.722437 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pgzwv"]
Apr 20 21:57:45.731014 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:57:45.730972 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba8b73d8_9c70_4740_89ed_11a7d98170a8.slice/crio-2247663724f896845d9c4f768013792db1408808d725529812052e50f9b2b973 WatchSource:0}: Error finding container 2247663724f896845d9c4f768013792db1408808d725529812052e50f9b2b973: Status 404 returned error can't find the container with id 2247663724f896845d9c4f768013792db1408808d725529812052e50f9b2b973
Apr 20 21:57:45.750889 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.750853 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6s5hc"
Apr 20 21:57:45.906603 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:45.906576 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-6s5hc"]
Apr 20 21:57:45.909271 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:57:45.909243 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod354f3712_7503_4276_9013_68b1577ca6fb.slice/crio-8fb3f5117ceb8bb6f8c97cfce4f6f5a6bf21128b4a9c13c14857c56ce00ac74c WatchSource:0}: Error finding container 8fb3f5117ceb8bb6f8c97cfce4f6f5a6bf21128b4a9c13c14857c56ce00ac74c: Status 404 returned error can't find the container with id 8fb3f5117ceb8bb6f8c97cfce4f6f5a6bf21128b4a9c13c14857c56ce00ac74c
Apr 20 21:57:46.246913 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:46.246873 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv" event={"ID":"ba8b73d8-9c70-4740-89ed-11a7d98170a8","Type":"ContainerStarted","Data":"2247663724f896845d9c4f768013792db1408808d725529812052e50f9b2b973"}
Apr 20 21:57:46.248029 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:46.248003 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6s5hc" event={"ID":"354f3712-7503-4276-9013-68b1577ca6fb","Type":"ContainerStarted","Data":"8fb3f5117ceb8bb6f8c97cfce4f6f5a6bf21128b4a9c13c14857c56ce00ac74c"}
Apr 20 21:57:50.265827 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:50.265787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6s5hc" event={"ID":"354f3712-7503-4276-9013-68b1577ca6fb","Type":"ContainerStarted","Data":"ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d"}
Apr 20 21:57:50.267183 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:50.267156 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv" event={"ID":"ba8b73d8-9c70-4740-89ed-11a7d98170a8","Type":"ContainerStarted","Data":"e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3"}
Apr 20 21:57:50.268417 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:50.268398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv" event={"ID":"70ec7151-d666-4132-aa85-6cd2da29d787","Type":"ContainerStarted","Data":"d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31"}
Apr 20 21:57:50.268522 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:50.268514 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:57:50.278781 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:50.278736 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-6s5hc" podStartSLOduration=1.615324356 podStartE2EDuration="5.278725322s" podCreationTimestamp="2026-04-20 21:57:45 +0000 UTC" firstStartedPulling="2026-04-20 21:57:45.9108633 +0000 UTC m=+636.071717423" lastFinishedPulling="2026-04-20 21:57:49.574264267 +0000 UTC m=+639.735118389" observedRunningTime="2026-04-20 21:57:50.277557093 +0000 UTC m=+640.438411236" watchObservedRunningTime="2026-04-20 21:57:50.278725322 +0000 UTC m=+640.439579469"
Apr 20 21:57:50.290768 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:50.290724 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv" podStartSLOduration=1.453155145 podStartE2EDuration="5.290711341s" podCreationTimestamp="2026-04-20 21:57:45 +0000 UTC" firstStartedPulling="2026-04-20 21:57:45.732544049 +0000 UTC m=+635.893398172" lastFinishedPulling="2026-04-20 21:57:49.570100245 +0000 UTC m=+639.730954368" observedRunningTime="2026-04-20 21:57:50.289785662 +0000 UTC m=+640.450639802" watchObservedRunningTime="2026-04-20 21:57:50.290711341 +0000 UTC m=+640.451565504"
Apr 20 21:57:50.305190 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:50.305132 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv" podStartSLOduration=1.608870062 podStartE2EDuration="6.305113998s" podCreationTimestamp="2026-04-20 21:57:44 +0000 UTC" firstStartedPulling="2026-04-20 21:57:44.87434155 +0000 UTC m=+635.035195668" lastFinishedPulling="2026-04-20 21:57:49.57058547 +0000 UTC m=+639.731439604" observedRunningTime="2026-04-20 21:57:50.303452096 +0000 UTC m=+640.464306236" watchObservedRunningTime="2026-04-20 21:57:50.305113998 +0000 UTC m=+640.465968139"
Apr 20 21:57:50.315906 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:50.315879 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pgzwv"]
Apr 20 21:57:52.274238 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:52.274198 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv" podUID="ba8b73d8-9c70-4740-89ed-11a7d98170a8" containerName="authorino" containerID="cri-o://e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3" gracePeriod=30
Apr 20 21:57:52.510934 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:52.510912 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv"
Apr 20 21:57:52.604908 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:52.604879 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgv8w\" (UniqueName: \"kubernetes.io/projected/ba8b73d8-9c70-4740-89ed-11a7d98170a8-kube-api-access-qgv8w\") pod \"ba8b73d8-9c70-4740-89ed-11a7d98170a8\" (UID: \"ba8b73d8-9c70-4740-89ed-11a7d98170a8\") "
Apr 20 21:57:52.606832 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:52.606808 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8b73d8-9c70-4740-89ed-11a7d98170a8-kube-api-access-qgv8w" (OuterVolumeSpecName: "kube-api-access-qgv8w") pod "ba8b73d8-9c70-4740-89ed-11a7d98170a8" (UID: "ba8b73d8-9c70-4740-89ed-11a7d98170a8"). InnerVolumeSpecName "kube-api-access-qgv8w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:57:52.706110 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:52.706074 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgv8w\" (UniqueName: \"kubernetes.io/projected/ba8b73d8-9c70-4740-89ed-11a7d98170a8-kube-api-access-qgv8w\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\""
Apr 20 21:57:53.278317 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.278280 2574 generic.go:358] "Generic (PLEG): container finished" podID="ba8b73d8-9c70-4740-89ed-11a7d98170a8" containerID="e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3" exitCode=0
Apr 20 21:57:53.278317 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.278318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv" event={"ID":"ba8b73d8-9c70-4740-89ed-11a7d98170a8","Type":"ContainerDied","Data":"e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3"}
Apr 20 21:57:53.278798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.278341 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv" event={"ID":"ba8b73d8-9c70-4740-89ed-11a7d98170a8","Type":"ContainerDied","Data":"2247663724f896845d9c4f768013792db1408808d725529812052e50f9b2b973"}
Apr 20 21:57:53.278798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.278355 2574 scope.go:117] "RemoveContainer" containerID="e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3"
Apr 20 21:57:53.278798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.278316 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-pgzwv"
Apr 20 21:57:53.287208 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.287190 2574 scope.go:117] "RemoveContainer" containerID="e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3"
Apr 20 21:57:53.287466 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:57:53.287447 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3\": container with ID starting with e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3 not found: ID does not exist" containerID="e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3"
Apr 20 21:57:53.287522 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.287474 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3"} err="failed to get container status \"e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3\": rpc error: code = NotFound desc = could not find container \"e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3\": container with ID starting with e18e65ac1d8bf0a141fe3a46b3c5940c15f23035f17d2cb0949800e6891b5de3 not found: ID does not exist"
Apr 20 21:57:53.298295 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.298271 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pgzwv"]
Apr 20 21:57:53.299698 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:53.299680 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-pgzwv"]
Apr 20 21:57:54.431428 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:54.431397 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8b73d8-9c70-4740-89ed-11a7d98170a8" path="/var/lib/kubelet/pods/ba8b73d8-9c70-4740-89ed-11a7d98170a8/volumes"
Apr 20 21:57:59.705177 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:59.705137 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m46xv"]
Apr 20 21:57:59.705717 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:59.705407 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv" podUID="70ec7151-d666-4132-aa85-6cd2da29d787" containerName="limitador" containerID="cri-o://d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31" gracePeriod=30
Apr 20 21:57:59.706072 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:57:59.706044 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:58:00.243845 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.243822 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Apr 20 21:58:00.264992 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.264964 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmv8w\" (UniqueName: \"kubernetes.io/projected/70ec7151-d666-4132-aa85-6cd2da29d787-kube-api-access-wmv8w\") pod \"70ec7151-d666-4132-aa85-6cd2da29d787\" (UID: \"70ec7151-d666-4132-aa85-6cd2da29d787\") "
Apr 20 21:58:00.265154 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.265038 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/70ec7151-d666-4132-aa85-6cd2da29d787-config-file\") pod \"70ec7151-d666-4132-aa85-6cd2da29d787\" (UID: \"70ec7151-d666-4132-aa85-6cd2da29d787\") "
Apr 20 21:58:00.265523 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.265455 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70ec7151-d666-4132-aa85-6cd2da29d787-config-file" (OuterVolumeSpecName: "config-file") pod "70ec7151-d666-4132-aa85-6cd2da29d787" (UID: "70ec7151-d666-4132-aa85-6cd2da29d787"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:58:00.267087 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.267058 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ec7151-d666-4132-aa85-6cd2da29d787-kube-api-access-wmv8w" (OuterVolumeSpecName: "kube-api-access-wmv8w") pod "70ec7151-d666-4132-aa85-6cd2da29d787" (UID: "70ec7151-d666-4132-aa85-6cd2da29d787"). InnerVolumeSpecName "kube-api-access-wmv8w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:58:00.299870 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.299839 2574 generic.go:358] "Generic (PLEG): container finished" podID="70ec7151-d666-4132-aa85-6cd2da29d787" containerID="d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31" exitCode=0
Apr 20 21:58:00.299989 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.299881 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv" event={"ID":"70ec7151-d666-4132-aa85-6cd2da29d787","Type":"ContainerDied","Data":"d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31"}
Apr 20 21:58:00.299989 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.299908 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv" event={"ID":"70ec7151-d666-4132-aa85-6cd2da29d787","Type":"ContainerDied","Data":"077f523dcf2fbd9630eab5c9f023935ddf0884db3568c2e9d99d3d09a9470097"}
Apr 20 21:58:00.299989 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.299910 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv"
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m46xv" Apr 20 21:58:00.299989 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.299924 2574 scope.go:117] "RemoveContainer" containerID="d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31" Apr 20 21:58:00.307392 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.307355 2574 scope.go:117] "RemoveContainer" containerID="d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31" Apr 20 21:58:00.307737 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:58:00.307639 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31\": container with ID starting with d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31 not found: ID does not exist" containerID="d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31" Apr 20 21:58:00.307737 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.307676 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31"} err="failed to get container status \"d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31\": rpc error: code = NotFound desc = could not find container \"d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31\": container with ID starting with d6fd657120e9044f2659cff2e2328986d4eac96d77cbded83753da8649e40d31 not found: ID does not exist" Apr 20 21:58:00.319777 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.319751 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m46xv"] Apr 20 21:58:00.322797 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.322778 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m46xv"] Apr 20 21:58:00.365973 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.365944 2574 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/70ec7151-d666-4132-aa85-6cd2da29d787-config-file\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:58:00.365973 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.365966 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wmv8w\" (UniqueName: \"kubernetes.io/projected/70ec7151-d666-4132-aa85-6cd2da29d787-kube-api-access-wmv8w\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:58:00.430833 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:00.430800 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ec7151-d666-4132-aa85-6cd2da29d787" path="/var/lib/kubelet/pods/70ec7151-d666-4132-aa85-6cd2da29d787/volumes" Apr 20 21:58:17.615299 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.615266 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5w6z7"] Apr 20 21:58:17.615696 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.615533 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70ec7151-d666-4132-aa85-6cd2da29d787" containerName="limitador" Apr 20 21:58:17.615696 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.615544 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ec7151-d666-4132-aa85-6cd2da29d787" containerName="limitador" Apr 20 21:58:17.615696 ip-10-0-140-110 kubenswrapper[2574]: I0420 
21:58:17.615558 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba8b73d8-9c70-4740-89ed-11a7d98170a8" containerName="authorino" Apr 20 21:58:17.615696 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.615563 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8b73d8-9c70-4740-89ed-11a7d98170a8" containerName="authorino" Apr 20 21:58:17.615696 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.615607 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba8b73d8-9c70-4740-89ed-11a7d98170a8" containerName="authorino" Apr 20 21:58:17.615696 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.615618 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="70ec7151-d666-4132-aa85-6cd2da29d787" containerName="limitador" Apr 20 21:58:17.635502 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.635465 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5w6z7"] Apr 20 21:58:17.635648 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.635601 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" Apr 20 21:58:17.698342 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.698311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9gp6\" (UniqueName: \"kubernetes.io/projected/973aaa97-399f-4499-9d7d-99bd88751986-kube-api-access-p9gp6\") pod \"authorino-8b475cf9f-5w6z7\" (UID: \"973aaa97-399f-4499-9d7d-99bd88751986\") " pod="kuadrant-system/authorino-8b475cf9f-5w6z7" Apr 20 21:58:17.799111 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.799078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9gp6\" (UniqueName: \"kubernetes.io/projected/973aaa97-399f-4499-9d7d-99bd88751986-kube-api-access-p9gp6\") pod \"authorino-8b475cf9f-5w6z7\" (UID: \"973aaa97-399f-4499-9d7d-99bd88751986\") " pod="kuadrant-system/authorino-8b475cf9f-5w6z7" Apr 20 21:58:17.806870 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.806840 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9gp6\" (UniqueName: \"kubernetes.io/projected/973aaa97-399f-4499-9d7d-99bd88751986-kube-api-access-p9gp6\") pod \"authorino-8b475cf9f-5w6z7\" (UID: \"973aaa97-399f-4499-9d7d-99bd88751986\") " pod="kuadrant-system/authorino-8b475cf9f-5w6z7" Apr 20 21:58:17.848617 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.848592 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5w6z7"] Apr 20 21:58:17.848800 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.848789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" Apr 20 21:58:17.875386 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.875309 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-57b8fdb5cb-cm9bf"] Apr 20 21:58:17.880129 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.880107 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-57b8fdb5cb-cm9bf" Apr 20 21:58:17.890488 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.890396 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-57b8fdb5cb-cm9bf"] Apr 20 21:58:17.904609 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.904585 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57b8fdb5cb-cm9bf"] Apr 20 21:58:17.904819 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:58:17.904797 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-vt77z], unattached volumes=[], failed to process volumes=[kube-api-access-vt77z]: context canceled" pod="kuadrant-system/authorino-57b8fdb5cb-cm9bf" podUID="0ae63f63-f980-4bb1-b11a-21fe92c4308e" Apr 20 21:58:17.948174 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.947787 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-bb8f8449b-txvkq"] Apr 20 21:58:17.951171 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.951148 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 21:58:17.954225 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.954205 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 21:58:17.962159 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.962134 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-bb8f8449b-txvkq"] Apr 20 21:58:17.987801 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:17.987780 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5w6z7"] Apr 20 21:58:17.990079 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:58:17.990052 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod973aaa97_399f_4499_9d7d_99bd88751986.slice/crio-68081af8a979eb2cb4160809c169a1f662d7fe7fb0b8edf51ae12234ac3a52cb WatchSource:0}: Error finding container 68081af8a979eb2cb4160809c169a1f662d7fe7fb0b8edf51ae12234ac3a52cb: Status 404 returned error can't find the container with id 68081af8a979eb2cb4160809c169a1f662d7fe7fb0b8edf51ae12234ac3a52cb Apr 20 21:58:18.000924 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.000904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk5jc\" (UniqueName: \"kubernetes.io/projected/7a3ee233-4a16-4f4e-b4af-c93a965da896-kube-api-access-wk5jc\") pod \"authorino-bb8f8449b-txvkq\" (UID: \"7a3ee233-4a16-4f4e-b4af-c93a965da896\") " pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 21:58:18.001020 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.000938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7a3ee233-4a16-4f4e-b4af-c93a965da896-tls-cert\") pod \"authorino-bb8f8449b-txvkq\" (UID: \"7a3ee233-4a16-4f4e-b4af-c93a965da896\") " pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 21:58:18.001020 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.000963 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt77z\" (UniqueName: \"kubernetes.io/projected/0ae63f63-f980-4bb1-b11a-21fe92c4308e-kube-api-access-vt77z\") pod \"authorino-57b8fdb5cb-cm9bf\" (UID: 
\"0ae63f63-f980-4bb1-b11a-21fe92c4308e\") " pod="kuadrant-system/authorino-57b8fdb5cb-cm9bf" Apr 20 21:58:18.102091 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.102056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk5jc\" (UniqueName: \"kubernetes.io/projected/7a3ee233-4a16-4f4e-b4af-c93a965da896-kube-api-access-wk5jc\") pod \"authorino-bb8f8449b-txvkq\" (UID: \"7a3ee233-4a16-4f4e-b4af-c93a965da896\") " pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 21:58:18.102255 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.102102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7a3ee233-4a16-4f4e-b4af-c93a965da896-tls-cert\") pod \"authorino-bb8f8449b-txvkq\" (UID: \"7a3ee233-4a16-4f4e-b4af-c93a965da896\") " pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 21:58:18.102255 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.102138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt77z\" (UniqueName: \"kubernetes.io/projected/0ae63f63-f980-4bb1-b11a-21fe92c4308e-kube-api-access-vt77z\") pod \"authorino-57b8fdb5cb-cm9bf\" (UID: \"0ae63f63-f980-4bb1-b11a-21fe92c4308e\") " pod="kuadrant-system/authorino-57b8fdb5cb-cm9bf" Apr 20 21:58:18.104541 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.104518 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7a3ee233-4a16-4f4e-b4af-c93a965da896-tls-cert\") pod \"authorino-bb8f8449b-txvkq\" (UID: \"7a3ee233-4a16-4f4e-b4af-c93a965da896\") " pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 21:58:18.109918 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.109891 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk5jc\" (UniqueName: \"kubernetes.io/projected/7a3ee233-4a16-4f4e-b4af-c93a965da896-kube-api-access-wk5jc\") pod \"authorino-bb8f8449b-txvkq\" (UID: \"7a3ee233-4a16-4f4e-b4af-c93a965da896\") " pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 21:58:18.110042 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.109961 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt77z\" (UniqueName: \"kubernetes.io/projected/0ae63f63-f980-4bb1-b11a-21fe92c4308e-kube-api-access-vt77z\") pod \"authorino-57b8fdb5cb-cm9bf\" (UID: \"0ae63f63-f980-4bb1-b11a-21fe92c4308e\") " pod="kuadrant-system/authorino-57b8fdb5cb-cm9bf" Apr 20 21:58:18.260172 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.260077 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 21:58:18.358020 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.357973 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57b8fdb5cb-cm9bf" Apr 20 21:58:18.358172 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.357966 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" event={"ID":"973aaa97-399f-4499-9d7d-99bd88751986","Type":"ContainerStarted","Data":"68081af8a979eb2cb4160809c169a1f662d7fe7fb0b8edf51ae12234ac3a52cb"} Apr 20 21:58:18.363308 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.363290 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-57b8fdb5cb-cm9bf" Apr 20 21:58:18.379032 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.379008 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-bb8f8449b-txvkq"] Apr 20 21:58:18.381816 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:58:18.381790 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a3ee233_4a16_4f4e_b4af_c93a965da896.slice/crio-7ae6537571630a797c4eac23882ac7dbd8e2e5485f74732c5ea90fd581033e0c WatchSource:0}: Error finding container 7ae6537571630a797c4eac23882ac7dbd8e2e5485f74732c5ea90fd581033e0c: Status 404 returned error can't find the container with id 7ae6537571630a797c4eac23882ac7dbd8e2e5485f74732c5ea90fd581033e0c Apr 20 21:58:18.405419 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.405391 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt77z\" (UniqueName: \"kubernetes.io/projected/0ae63f63-f980-4bb1-b11a-21fe92c4308e-kube-api-access-vt77z\") pod \"0ae63f63-f980-4bb1-b11a-21fe92c4308e\" (UID: \"0ae63f63-f980-4bb1-b11a-21fe92c4308e\") " Apr 20 21:58:18.407185 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.407154 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae63f63-f980-4bb1-b11a-21fe92c4308e-kube-api-access-vt77z" (OuterVolumeSpecName: "kube-api-access-vt77z") pod "0ae63f63-f980-4bb1-b11a-21fe92c4308e" (UID: "0ae63f63-f980-4bb1-b11a-21fe92c4308e"). InnerVolumeSpecName "kube-api-access-vt77z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:58:18.506178 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:18.506143 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vt77z\" (UniqueName: \"kubernetes.io/projected/0ae63f63-f980-4bb1-b11a-21fe92c4308e-kube-api-access-vt77z\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:58:19.362861 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.362820 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" event={"ID":"973aaa97-399f-4499-9d7d-99bd88751986","Type":"ContainerStarted","Data":"6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9"} Apr 20 21:58:19.363291 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.362890 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" podUID="973aaa97-399f-4499-9d7d-99bd88751986" containerName="authorino" containerID="cri-o://6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9" gracePeriod=30 Apr 20 21:58:19.364204 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.364183 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-57b8fdb5cb-cm9bf" Apr 20 21:58:19.364299 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.364201 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-bb8f8449b-txvkq" event={"ID":"7a3ee233-4a16-4f4e-b4af-c93a965da896","Type":"ContainerStarted","Data":"f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71"} Apr 20 21:58:19.364299 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.364229 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-bb8f8449b-txvkq" event={"ID":"7a3ee233-4a16-4f4e-b4af-c93a965da896","Type":"ContainerStarted","Data":"7ae6537571630a797c4eac23882ac7dbd8e2e5485f74732c5ea90fd581033e0c"} Apr 20 21:58:19.377332 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.377294 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" podStartSLOduration=2.019922403 podStartE2EDuration="2.377282941s" podCreationTimestamp="2026-04-20 21:58:17 +0000 UTC" firstStartedPulling="2026-04-20 21:58:17.991289262 +0000 UTC m=+668.152143380" lastFinishedPulling="2026-04-20 21:58:18.348649796 +0000 UTC m=+668.509503918" observedRunningTime="2026-04-20 21:58:19.375589932 +0000 UTC m=+669.536444073" watchObservedRunningTime="2026-04-20 21:58:19.377282941 +0000 UTC m=+669.538137124" Apr 20 21:58:19.388968 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.388931 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-bb8f8449b-txvkq" podStartSLOduration=2.095781589 podStartE2EDuration="2.388920104s" podCreationTimestamp="2026-04-20 21:58:17 +0000 UTC" firstStartedPulling="2026-04-20 21:58:18.383011141 +0000 UTC m=+668.543865259" lastFinishedPulling="2026-04-20 21:58:18.676149639 +0000 UTC m=+668.837003774" observedRunningTime="2026-04-20 21:58:19.387875089 +0000 UTC m=+669.548729229" watchObservedRunningTime="2026-04-20 21:58:19.388920104 +0000 UTC m=+669.549774222" Apr 20 21:58:19.408657 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.408631 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57b8fdb5cb-cm9bf"] Apr 20 21:58:19.414095 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.414074 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-6s5hc"] Apr 20 21:58:19.414292 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.414264 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-6s5hc" podUID="354f3712-7503-4276-9013-68b1577ca6fb" containerName="authorino" containerID="cri-o://ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d" gracePeriod=30 Apr 20 21:58:19.416567 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.416544 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-57b8fdb5cb-cm9bf"] Apr 20 21:58:19.663777 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.663755 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" Apr 20 21:58:19.672736 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.672719 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6s5hc" Apr 20 21:58:19.717437 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.717395 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpcr7\" (UniqueName: \"kubernetes.io/projected/354f3712-7503-4276-9013-68b1577ca6fb-kube-api-access-rpcr7\") pod \"354f3712-7503-4276-9013-68b1577ca6fb\" (UID: \"354f3712-7503-4276-9013-68b1577ca6fb\") " Apr 20 21:58:19.717576 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.717463 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9gp6\" (UniqueName: \"kubernetes.io/projected/973aaa97-399f-4499-9d7d-99bd88751986-kube-api-access-p9gp6\") pod \"973aaa97-399f-4499-9d7d-99bd88751986\" (UID: \"973aaa97-399f-4499-9d7d-99bd88751986\") " Apr 20 21:58:19.719424 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.719397 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973aaa97-399f-4499-9d7d-99bd88751986-kube-api-access-p9gp6" (OuterVolumeSpecName: "kube-api-access-p9gp6") pod "973aaa97-399f-4499-9d7d-99bd88751986" (UID: "973aaa97-399f-4499-9d7d-99bd88751986"). InnerVolumeSpecName "kube-api-access-p9gp6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:58:19.719424 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.719412 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354f3712-7503-4276-9013-68b1577ca6fb-kube-api-access-rpcr7" (OuterVolumeSpecName: "kube-api-access-rpcr7") pod "354f3712-7503-4276-9013-68b1577ca6fb" (UID: "354f3712-7503-4276-9013-68b1577ca6fb"). InnerVolumeSpecName "kube-api-access-rpcr7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:58:19.819035 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.819005 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9gp6\" (UniqueName: \"kubernetes.io/projected/973aaa97-399f-4499-9d7d-99bd88751986-kube-api-access-p9gp6\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:58:19.819035 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:19.819031 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rpcr7\" (UniqueName: \"kubernetes.io/projected/354f3712-7503-4276-9013-68b1577ca6fb-kube-api-access-rpcr7\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:58:20.237145 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.237108 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-knw9z"] Apr 20 21:58:20.237410 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.237391 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="354f3712-7503-4276-9013-68b1577ca6fb" containerName="authorino" Apr 20 21:58:20.237465 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.237411 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f3712-7503-4276-9013-68b1577ca6fb" containerName="authorino" Apr 20 21:58:20.237465 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.237452 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="973aaa97-399f-4499-9d7d-99bd88751986" containerName="authorino" Apr 20 21:58:20.237465 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.237457 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="973aaa97-399f-4499-9d7d-99bd88751986" containerName="authorino" Apr 20 21:58:20.237556 
ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.237503 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="354f3712-7503-4276-9013-68b1577ca6fb" containerName="authorino" Apr 20 21:58:20.237556 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.237511 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="973aaa97-399f-4499-9d7d-99bd88751986" containerName="authorino" Apr 20 21:58:20.240611 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.240586 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" Apr 20 21:58:20.242617 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.242593 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-j2fgl\"" Apr 20 21:58:20.248338 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.248317 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-knw9z"] Apr 20 21:58:20.322153 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.322119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75kw\" (UniqueName: \"kubernetes.io/projected/dd508b32-6a93-4a5c-8d2a-8579af3558f2-kube-api-access-z75kw\") pod \"maas-controller-6d4c8f55f9-knw9z\" (UID: \"dd508b32-6a93-4a5c-8d2a-8579af3558f2\") " pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" Apr 20 21:58:20.368285 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.368255 2574 generic.go:358] "Generic (PLEG): container finished" podID="354f3712-7503-4276-9013-68b1577ca6fb" containerID="ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d" exitCode=0 Apr 20 21:58:20.368756 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.368305 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-6s5hc" Apr 20 21:58:20.368756 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.368325 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6s5hc" event={"ID":"354f3712-7503-4276-9013-68b1577ca6fb","Type":"ContainerDied","Data":"ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d"} Apr 20 21:58:20.368756 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.368363 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-6s5hc" event={"ID":"354f3712-7503-4276-9013-68b1577ca6fb","Type":"ContainerDied","Data":"8fb3f5117ceb8bb6f8c97cfce4f6f5a6bf21128b4a9c13c14857c56ce00ac74c"} Apr 20 21:58:20.368756 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.368417 2574 scope.go:117] "RemoveContainer" containerID="ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d" Apr 20 21:58:20.369673 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.369502 2574 generic.go:358] "Generic (PLEG): container finished" podID="973aaa97-399f-4499-9d7d-99bd88751986" containerID="6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9" exitCode=0 Apr 20 21:58:20.369673 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.369548 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" Apr 20 21:58:20.369673 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.369579 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" event={"ID":"973aaa97-399f-4499-9d7d-99bd88751986","Type":"ContainerDied","Data":"6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9"} Apr 20 21:58:20.369673 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.369607 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-5w6z7" event={"ID":"973aaa97-399f-4499-9d7d-99bd88751986","Type":"ContainerDied","Data":"68081af8a979eb2cb4160809c169a1f662d7fe7fb0b8edf51ae12234ac3a52cb"} Apr 20 21:58:20.376518 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.376499 2574 scope.go:117] "RemoveContainer" containerID="ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d" Apr 20 21:58:20.376832 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:58:20.376796 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d\": container with ID starting with ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d not found: ID does not exist" containerID="ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d" Apr 20 21:58:20.376952 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.376832 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d"} err="failed to get container status \"ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d\": rpc error: code = NotFound desc = could not find container \"ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d\": container with ID starting with ab64d24c0864ddcacff58666e2d2cfce8029c67a8e9cecfd51242e2503a0993d not found: ID does not exist" Apr 20 21:58:20.376952 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.376852 2574 scope.go:117] "RemoveContainer" containerID="6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9" Apr 20 21:58:20.384340 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.384326 2574 scope.go:117] "RemoveContainer" containerID="6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9" Apr 20 21:58:20.384563 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:58:20.384547 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9\": container with ID starting with 6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9 not found: ID does not exist" containerID="6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9" Apr 20 21:58:20.384604 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.384570 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9"} err="failed to get container status \"6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9\": rpc error: code = NotFound desc = could not find container \"6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9\": container with ID starting with 6cd0582a3ce20df1012b80f232abdcf1a76f4ca2e4acd9910d98fb3dc0e983c9 not found: ID does not exist" Apr 20 21:58:20.389450 
ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.387493 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7b5fdb989c-xz9xk"] Apr 20 21:58:20.392910 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.392889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:20.399358 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.399324 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b5fdb989c-xz9xk"] Apr 20 21:58:20.403207 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.403185 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-6s5hc"] Apr 20 21:58:20.407988 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.407967 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-6s5hc"] Apr 20 21:58:20.416855 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.416834 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5w6z7"] Apr 20 21:58:20.418307 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.418290 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5w6z7"] Apr 20 21:58:20.422944 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.422925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z75kw\" (UniqueName: \"kubernetes.io/projected/dd508b32-6a93-4a5c-8d2a-8579af3558f2-kube-api-access-z75kw\") pod \"maas-controller-6d4c8f55f9-knw9z\" (UID: \"dd508b32-6a93-4a5c-8d2a-8579af3558f2\") " pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" Apr 20 21:58:20.431302 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.431280 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae63f63-f980-4bb1-b11a-21fe92c4308e" path="/var/lib/kubelet/pods/0ae63f63-f980-4bb1-b11a-21fe92c4308e/volumes" Apr 20 21:58:20.431507 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.431492 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354f3712-7503-4276-9013-68b1577ca6fb" path="/var/lib/kubelet/pods/354f3712-7503-4276-9013-68b1577ca6fb/volumes" Apr 20 21:58:20.431771 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.431757 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973aaa97-399f-4499-9d7d-99bd88751986" path="/var/lib/kubelet/pods/973aaa97-399f-4499-9d7d-99bd88751986/volumes" Apr 20 21:58:20.433263 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.433241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75kw\" (UniqueName: \"kubernetes.io/projected/dd508b32-6a93-4a5c-8d2a-8579af3558f2-kube-api-access-z75kw\") pod \"maas-controller-6d4c8f55f9-knw9z\" (UID: \"dd508b32-6a93-4a5c-8d2a-8579af3558f2\") " pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" Apr 20 21:58:20.509265 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.509189 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-knw9z"] Apr 20 21:58:20.509435 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.509421 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" Apr 20 21:58:20.524229 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.524203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bckj5\" (UniqueName: \"kubernetes.io/projected/ebfe9e71-aa87-4085-a923-378ebe4fa566-kube-api-access-bckj5\") pod \"maas-controller-7b5fdb989c-xz9xk\" (UID: \"ebfe9e71-aa87-4085-a923-378ebe4fa566\") " pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:20.533388 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.533343 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-58cc7b9fdb-6q8k6"] Apr 20 21:58:20.537802 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.537784 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:20.543912 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.543888 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-58cc7b9fdb-6q8k6"] Apr 20 21:58:20.625240 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.625210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vldlk\" (UniqueName: \"kubernetes.io/projected/2e592db2-d36a-458f-84e5-ac95202637fa-kube-api-access-vldlk\") pod \"maas-controller-58cc7b9fdb-6q8k6\" (UID: \"2e592db2-d36a-458f-84e5-ac95202637fa\") " pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:20.625407 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.625282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bckj5\" (UniqueName: \"kubernetes.io/projected/ebfe9e71-aa87-4085-a923-378ebe4fa566-kube-api-access-bckj5\") pod \"maas-controller-7b5fdb989c-xz9xk\" (UID: \"ebfe9e71-aa87-4085-a923-378ebe4fa566\") " pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:20.634772 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.634750 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-knw9z"] Apr 20 21:58:20.636550 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.636530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckj5\" (UniqueName: \"kubernetes.io/projected/ebfe9e71-aa87-4085-a923-378ebe4fa566-kube-api-access-bckj5\") pod \"maas-controller-7b5fdb989c-xz9xk\" (UID: \"ebfe9e71-aa87-4085-a923-378ebe4fa566\") " pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:20.637161 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:58:20.637132 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd508b32_6a93_4a5c_8d2a_8579af3558f2.slice/crio-6d4d6aafd25e6e55b45e711ccd1aaacc84766ae598cb1a05ad991d821b74ba57 WatchSource:0}: Error finding container 6d4d6aafd25e6e55b45e711ccd1aaacc84766ae598cb1a05ad991d821b74ba57: Status 404 returned error can't find the container with id 6d4d6aafd25e6e55b45e711ccd1aaacc84766ae598cb1a05ad991d821b74ba57 Apr 20 21:58:20.703620 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.703591 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:20.726580 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.726554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vldlk\" (UniqueName: \"kubernetes.io/projected/2e592db2-d36a-458f-84e5-ac95202637fa-kube-api-access-vldlk\") pod \"maas-controller-58cc7b9fdb-6q8k6\" (UID: \"2e592db2-d36a-458f-84e5-ac95202637fa\") " pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:20.734805 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.734783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vldlk\" (UniqueName: \"kubernetes.io/projected/2e592db2-d36a-458f-84e5-ac95202637fa-kube-api-access-vldlk\") pod \"maas-controller-58cc7b9fdb-6q8k6\" (UID: \"2e592db2-d36a-458f-84e5-ac95202637fa\") " pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:20.826311 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.826282 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7b5fdb989c-xz9xk"] Apr 20 21:58:20.829519 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:58:20.829489 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebfe9e71_aa87_4085_a923_378ebe4fa566.slice/crio-1bb1e8a81cb35670e4ae1d8e4bfe321f2bf12b9542b955a08f955c4b7baf8962 WatchSource:0}: Error finding container 1bb1e8a81cb35670e4ae1d8e4bfe321f2bf12b9542b955a08f955c4b7baf8962: Status 404 returned error can't find the container with id 1bb1e8a81cb35670e4ae1d8e4bfe321f2bf12b9542b955a08f955c4b7baf8962 Apr 20 21:58:20.849304 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.849276 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:20.966167 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:20.966144 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-58cc7b9fdb-6q8k6"] Apr 20 21:58:20.968029 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:58:20.967998 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e592db2_d36a_458f_84e5_ac95202637fa.slice/crio-942bfe8a8262649d39535f3aaccb4f5d51c39be63b8a2fa442da35eac4bdb2d4 WatchSource:0}: Error finding container 942bfe8a8262649d39535f3aaccb4f5d51c39be63b8a2fa442da35eac4bdb2d4: Status 404 returned error can't find the container with id 942bfe8a8262649d39535f3aaccb4f5d51c39be63b8a2fa442da35eac4bdb2d4 Apr 20 21:58:21.377689 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:21.377647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" event={"ID":"2e592db2-d36a-458f-84e5-ac95202637fa","Type":"ContainerStarted","Data":"942bfe8a8262649d39535f3aaccb4f5d51c39be63b8a2fa442da35eac4bdb2d4"} Apr 20 21:58:21.379761 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:21.379731 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" event={"ID":"dd508b32-6a93-4a5c-8d2a-8579af3558f2","Type":"ContainerStarted","Data":"6d4d6aafd25e6e55b45e711ccd1aaacc84766ae598cb1a05ad991d821b74ba57"} Apr 20 21:58:21.381968 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:21.381939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" event={"ID":"ebfe9e71-aa87-4085-a923-378ebe4fa566","Type":"ContainerStarted","Data":"1bb1e8a81cb35670e4ae1d8e4bfe321f2bf12b9542b955a08f955c4b7baf8962"} Apr 20 21:58:24.395401 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.395346 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" event={"ID":"ebfe9e71-aa87-4085-a923-378ebe4fa566","Type":"ContainerStarted","Data":"146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210"} Apr 20 21:58:24.395857 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.395468 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:24.396824 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.396804 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" event={"ID":"2e592db2-d36a-458f-84e5-ac95202637fa","Type":"ContainerStarted","Data":"924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481"} Apr 20 21:58:24.396946 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.396933 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:24.398081 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.398055 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" event={"ID":"dd508b32-6a93-4a5c-8d2a-8579af3558f2","Type":"ContainerStarted","Data":"cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a"} Apr 20 21:58:24.398239 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.398211 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" Apr 20 21:58:24.398239 ip-10-0-140-110 
Apr 20 21:58:24.411960 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.411912 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" podStartSLOduration=1.363877657 podStartE2EDuration="4.4118965s" podCreationTimestamp="2026-04-20 21:58:20 +0000 UTC" firstStartedPulling="2026-04-20 21:58:20.830893359 +0000 UTC m=+670.991747478" lastFinishedPulling="2026-04-20 21:58:23.878912202 +0000 UTC m=+674.039766321" observedRunningTime="2026-04-20 21:58:24.410336976 +0000 UTC m=+674.571191116" watchObservedRunningTime="2026-04-20 21:58:24.4118965 +0000 UTC m=+674.572750642"
Apr 20 21:58:24.427378 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.427326 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" podStartSLOduration=1.5075895670000001 podStartE2EDuration="4.427314604s" podCreationTimestamp="2026-04-20 21:58:20 +0000 UTC" firstStartedPulling="2026-04-20 21:58:20.969438708 +0000 UTC m=+671.130292827" lastFinishedPulling="2026-04-20 21:58:23.88916373 +0000 UTC m=+674.050017864" observedRunningTime="2026-04-20 21:58:24.426136322 +0000 UTC m=+674.586990466" watchObservedRunningTime="2026-04-20 21:58:24.427314604 +0000 UTC m=+674.588168744"
Apr 20 21:58:24.441922 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.441878 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" podStartSLOduration=1.2012047670000001 podStartE2EDuration="4.441865223s" podCreationTimestamp="2026-04-20 21:58:20 +0000 UTC" firstStartedPulling="2026-04-20 21:58:20.638456482 +0000 UTC m=+670.799310600" lastFinishedPulling="2026-04-20 21:58:23.879116924 +0000 UTC m=+674.039971056" observedRunningTime="2026-04-20 21:58:24.440963838 +0000 UTC m=+674.601817989" watchObservedRunningTime="2026-04-20 21:58:24.441865223 +0000 UTC m=+674.602719363"
Apr 20 21:58:24.631299 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.631277 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z"
Apr 20 21:58:24.764332 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.764251 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75kw\" (UniqueName: \"kubernetes.io/projected/dd508b32-6a93-4a5c-8d2a-8579af3558f2-kube-api-access-z75kw\") pod \"dd508b32-6a93-4a5c-8d2a-8579af3558f2\" (UID: \"dd508b32-6a93-4a5c-8d2a-8579af3558f2\") "
Apr 20 21:58:24.766391 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.766350 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd508b32-6a93-4a5c-8d2a-8579af3558f2-kube-api-access-z75kw" (OuterVolumeSpecName: "kube-api-access-z75kw") pod "dd508b32-6a93-4a5c-8d2a-8579af3558f2" (UID: "dd508b32-6a93-4a5c-8d2a-8579af3558f2"). InnerVolumeSpecName "kube-api-access-z75kw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:58:24.865001 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:24.864962 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z75kw\" (UniqueName: \"kubernetes.io/projected/dd508b32-6a93-4a5c-8d2a-8579af3558f2-kube-api-access-z75kw\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\""
Apr 20 21:58:25.402531 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.402495 2574 generic.go:358] "Generic (PLEG): container finished" podID="dd508b32-6a93-4a5c-8d2a-8579af3558f2" containerID="cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a" exitCode=0
Apr 20 21:58:25.402994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.402566 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z"
Apr 20 21:58:25.402994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.402587 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" event={"ID":"dd508b32-6a93-4a5c-8d2a-8579af3558f2","Type":"ContainerDied","Data":"cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a"}
Apr 20 21:58:25.402994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.402628 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-knw9z" event={"ID":"dd508b32-6a93-4a5c-8d2a-8579af3558f2","Type":"ContainerDied","Data":"6d4d6aafd25e6e55b45e711ccd1aaacc84766ae598cb1a05ad991d821b74ba57"}
Apr 20 21:58:25.402994 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.402645 2574 scope.go:117] "RemoveContainer" containerID="cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a"
Apr 20 21:58:25.412923 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.412897 2574 scope.go:117] "RemoveContainer" containerID="cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a"
Apr 20 21:58:25.413227 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:58:25.413204 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a\": container with ID starting with cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a not found: ID does not exist" containerID="cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a"
Apr 20 21:58:25.413311 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.413237 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a"} err="failed to get container status \"cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a\": rpc error: code = NotFound desc = could not find container \"cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a\": container with ID starting with cbceb2663e3ab9e522adbf2b988c18813faf4e4fba1539a123faf54882c7798a not found: ID does not exist"
Apr 20 21:58:25.424708 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.424687 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-knw9z"]
Apr 20 21:58:25.426882 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.426862 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-knw9z"]
Apr 20 21:58:25.974756 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.974723 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7dbc9b4b7f-8r55z"]
Apr 20 21:58:25.975040 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.975028 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd508b32-6a93-4a5c-8d2a-8579af3558f2" containerName="manager"
Apr 20 21:58:25.975093 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.975042 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd508b32-6a93-4a5c-8d2a-8579af3558f2" containerName="manager"
Apr 20 21:58:25.975127 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.975101 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd508b32-6a93-4a5c-8d2a-8579af3558f2" containerName="manager"
Apr 20 21:58:25.977856 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.977840 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z"
Apr 20 21:58:25.979648 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.979628 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 21:58:25.979850 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.979832 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 21:58:25.979955 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.979936 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-2zcsj\""
Apr 20 21:58:25.986772 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:25.986428 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7dbc9b4b7f-8r55z"]
Apr 20 21:58:26.075385 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.075350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tq5j\" (UniqueName: \"kubernetes.io/projected/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-kube-api-access-2tq5j\") pod \"maas-api-7dbc9b4b7f-8r55z\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " pod="opendatahub/maas-api-7dbc9b4b7f-8r55z"
Apr 20 21:58:26.075536 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.075419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-maas-api-tls\") pod \"maas-api-7dbc9b4b7f-8r55z\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " pod="opendatahub/maas-api-7dbc9b4b7f-8r55z"
Apr 20 21:58:26.176474 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.176444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tq5j\" (UniqueName: \"kubernetes.io/projected/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-kube-api-access-2tq5j\") pod \"maas-api-7dbc9b4b7f-8r55z\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " pod="opendatahub/maas-api-7dbc9b4b7f-8r55z"
Apr 20 21:58:26.176657 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.176489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-maas-api-tls\") pod \"maas-api-7dbc9b4b7f-8r55z\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " pod="opendatahub/maas-api-7dbc9b4b7f-8r55z"
Apr 20 21:58:26.179210 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.179182 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-maas-api-tls\") pod \"maas-api-7dbc9b4b7f-8r55z\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " pod="opendatahub/maas-api-7dbc9b4b7f-8r55z"
\"kubernetes.io/secret/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-maas-api-tls\") pod \"maas-api-7dbc9b4b7f-8r55z\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" Apr 20 21:58:26.183821 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.183798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tq5j\" (UniqueName: \"kubernetes.io/projected/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-kube-api-access-2tq5j\") pod \"maas-api-7dbc9b4b7f-8r55z\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" Apr 20 21:58:26.289423 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.289323 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" Apr 20 21:58:26.412223 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.412190 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7dbc9b4b7f-8r55z"] Apr 20 21:58:26.415435 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:58:26.415407 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac0fd3e8_6eeb_454e_9d76_62dd53fb36ae.slice/crio-117c7856c3f2a0e1204cff5845ff3eeb055e10b7214392341d53c16ad497f37d WatchSource:0}: Error finding container 117c7856c3f2a0e1204cff5845ff3eeb055e10b7214392341d53c16ad497f37d: Status 404 returned error can't find the container with id 117c7856c3f2a0e1204cff5845ff3eeb055e10b7214392341d53c16ad497f37d Apr 20 21:58:26.432080 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:26.432055 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd508b32-6a93-4a5c-8d2a-8579af3558f2" path="/var/lib/kubelet/pods/dd508b32-6a93-4a5c-8d2a-8579af3558f2/volumes" Apr 20 21:58:27.436632 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:27.436593 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" event={"ID":"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae","Type":"ContainerStarted","Data":"117c7856c3f2a0e1204cff5845ff3eeb055e10b7214392341d53c16ad497f37d"} Apr 20 21:58:28.441268 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:28.441230 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" event={"ID":"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae","Type":"ContainerStarted","Data":"7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37"} Apr 20 21:58:28.441667 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:28.441338 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" Apr 20 21:58:28.457115 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:28.457071 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" podStartSLOduration=2.2798726990000002 podStartE2EDuration="3.457057891s" podCreationTimestamp="2026-04-20 21:58:25 +0000 UTC" firstStartedPulling="2026-04-20 21:58:26.417180069 +0000 UTC m=+676.578034187" lastFinishedPulling="2026-04-20 21:58:27.594365261 +0000 UTC m=+677.755219379" observedRunningTime="2026-04-20 21:58:28.455352517 +0000 UTC m=+678.616206657" watchObservedRunningTime="2026-04-20 21:58:28.457057891 +0000 UTC m=+678.617912031" Apr 20 21:58:34.450348 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:34.450319 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" Apr 20 
21:58:35.407874 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.407840 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:35.408057 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.407980 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:35.459359 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.459329 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7b5fdb989c-xz9xk"] Apr 20 21:58:35.464794 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.464735 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" podUID="ebfe9e71-aa87-4085-a923-378ebe4fa566" containerName="manager" containerID="cri-o://146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210" gracePeriod=10 Apr 20 21:58:35.701166 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.701143 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:35.741301 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.741274 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-57d97678df-v9bqv"] Apr 20 21:58:35.741609 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.741597 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebfe9e71-aa87-4085-a923-378ebe4fa566" containerName="manager" Apr 20 21:58:35.741665 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.741611 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfe9e71-aa87-4085-a923-378ebe4fa566" containerName="manager" Apr 20 21:58:35.741665 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.741661 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebfe9e71-aa87-4085-a923-378ebe4fa566" containerName="manager" Apr 20 21:58:35.744945 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.744925 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 21:58:35.751321 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.751298 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-57d97678df-v9bqv"] Apr 20 21:58:35.856101 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.856068 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bckj5\" (UniqueName: \"kubernetes.io/projected/ebfe9e71-aa87-4085-a923-378ebe4fa566-kube-api-access-bckj5\") pod \"ebfe9e71-aa87-4085-a923-378ebe4fa566\" (UID: \"ebfe9e71-aa87-4085-a923-378ebe4fa566\") " Apr 20 21:58:35.856274 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.856198 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtbm5\" (UniqueName: \"kubernetes.io/projected/298f5698-70ee-4fcb-b2d6-b7238bc3d4e6-kube-api-access-vtbm5\") pod \"maas-controller-57d97678df-v9bqv\" (UID: \"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6\") " pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 21:58:35.858099 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.858075 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfe9e71-aa87-4085-a923-378ebe4fa566-kube-api-access-bckj5" (OuterVolumeSpecName: "kube-api-access-bckj5") pod "ebfe9e71-aa87-4085-a923-378ebe4fa566" (UID: "ebfe9e71-aa87-4085-a923-378ebe4fa566"). InnerVolumeSpecName "kube-api-access-bckj5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:58:35.957518 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.957436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtbm5\" (UniqueName: \"kubernetes.io/projected/298f5698-70ee-4fcb-b2d6-b7238bc3d4e6-kube-api-access-vtbm5\") pod \"maas-controller-57d97678df-v9bqv\" (UID: \"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6\") " pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 21:58:35.957518 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.957488 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bckj5\" (UniqueName: \"kubernetes.io/projected/ebfe9e71-aa87-4085-a923-378ebe4fa566-kube-api-access-bckj5\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:58:35.965655 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:35.965622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtbm5\" (UniqueName: \"kubernetes.io/projected/298f5698-70ee-4fcb-b2d6-b7238bc3d4e6-kube-api-access-vtbm5\") pod \"maas-controller-57d97678df-v9bqv\" (UID: \"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6\") " pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 21:58:36.057458 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.057418 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 21:58:36.173098 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.173073 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-57d97678df-v9bqv"] Apr 20 21:58:36.174927 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:58:36.174901 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod298f5698_70ee_4fcb_b2d6_b7238bc3d4e6.slice/crio-f669eeb73410ece75f1bb04ede07397829b8bb51d78a7e07322ea4b06a08ecb8 WatchSource:0}: Error finding container f669eeb73410ece75f1bb04ede07397829b8bb51d78a7e07322ea4b06a08ecb8: Status 404 returned error can't find the container with id f669eeb73410ece75f1bb04ede07397829b8bb51d78a7e07322ea4b06a08ecb8 Apr 20 21:58:36.468740 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.468705 2574 generic.go:358] "Generic (PLEG): container finished" podID="ebfe9e71-aa87-4085-a923-378ebe4fa566" containerID="146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210" exitCode=0 Apr 20 21:58:36.469121 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.468780 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" Apr 20 21:58:36.469121 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.468794 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" event={"ID":"ebfe9e71-aa87-4085-a923-378ebe4fa566","Type":"ContainerDied","Data":"146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210"} Apr 20 21:58:36.469121 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.468842 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7b5fdb989c-xz9xk" event={"ID":"ebfe9e71-aa87-4085-a923-378ebe4fa566","Type":"ContainerDied","Data":"1bb1e8a81cb35670e4ae1d8e4bfe321f2bf12b9542b955a08f955c4b7baf8962"} Apr 20 21:58:36.469121 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.468865 2574 scope.go:117] "RemoveContainer" containerID="146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210" Apr 20 21:58:36.470081 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.469898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57d97678df-v9bqv" event={"ID":"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6","Type":"ContainerStarted","Data":"f669eeb73410ece75f1bb04ede07397829b8bb51d78a7e07322ea4b06a08ecb8"} Apr 20 21:58:36.482142 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.482113 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-7b5fdb989c-xz9xk"] Apr 20 21:58:36.484875 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.484856 2574 scope.go:117] "RemoveContainer" containerID="146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210" Apr 20 21:58:36.485190 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:58:36.485163 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210\": container with ID starting with 146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210 not found: ID does not exist" containerID="146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210" Apr 20 21:58:36.485262 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.485202 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210"} err="failed to get container status \"146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210\": rpc error: code = NotFound desc = could not find container \"146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210\": container with ID starting with 146327dce7078c7a1fa669bf1cd9902dc45ff43e8333f44003b1a22342cd3210 not found: ID does not exist" Apr 20 21:58:36.485977 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:36.485952 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-7b5fdb989c-xz9xk"] Apr 20 21:58:37.474492 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:37.474456 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57d97678df-v9bqv" event={"ID":"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6","Type":"ContainerStarted","Data":"8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f"} Apr 20 21:58:37.474920 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:37.474565 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 21:58:37.490440 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:37.490359 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-57d97678df-v9bqv" podStartSLOduration=2.096692896 podStartE2EDuration="2.4903467s" podCreationTimestamp="2026-04-20 21:58:35 +0000 UTC" firstStartedPulling="2026-04-20 21:58:36.176143359 +0000 UTC m=+686.336997477" lastFinishedPulling="2026-04-20 21:58:36.569797162 +0000 UTC m=+686.730651281" observedRunningTime="2026-04-20 21:58:37.488573724 +0000 UTC m=+687.649427864" watchObservedRunningTime="2026-04-20 21:58:37.4903467 +0000 UTC m=+687.651200839" Apr 20 21:58:38.432180 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:38.432147 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfe9e71-aa87-4085-a923-378ebe4fa566" path="/var/lib/kubelet/pods/ebfe9e71-aa87-4085-a923-378ebe4fa566/volumes" Apr 20 21:58:48.482778 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:48.482752 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 21:58:48.520171 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:48.520140 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-58cc7b9fdb-6q8k6"] Apr 20 21:58:48.520426 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:48.520401 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" podUID="2e592db2-d36a-458f-84e5-ac95202637fa" containerName="manager" containerID="cri-o://924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481" gracePeriod=10 Apr 20 21:58:48.755833 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:48.755812 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:48.862992 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:48.862959 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vldlk\" (UniqueName: \"kubernetes.io/projected/2e592db2-d36a-458f-84e5-ac95202637fa-kube-api-access-vldlk\") pod \"2e592db2-d36a-458f-84e5-ac95202637fa\" (UID: \"2e592db2-d36a-458f-84e5-ac95202637fa\") " Apr 20 21:58:48.864915 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:48.864889 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e592db2-d36a-458f-84e5-ac95202637fa-kube-api-access-vldlk" (OuterVolumeSpecName: "kube-api-access-vldlk") pod "2e592db2-d36a-458f-84e5-ac95202637fa" (UID: "2e592db2-d36a-458f-84e5-ac95202637fa"). InnerVolumeSpecName "kube-api-access-vldlk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:58:48.964214 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:48.964183 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vldlk\" (UniqueName: \"kubernetes.io/projected/2e592db2-d36a-458f-84e5-ac95202637fa-kube-api-access-vldlk\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:58:49.510230 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.510189 2574 generic.go:358] "Generic (PLEG): container finished" podID="2e592db2-d36a-458f-84e5-ac95202637fa" containerID="924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481" exitCode=0 Apr 20 21:58:49.510717 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.510327 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" event={"ID":"2e592db2-d36a-458f-84e5-ac95202637fa","Type":"ContainerDied","Data":"924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481"} Apr 20 21:58:49.510717 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.510356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" event={"ID":"2e592db2-d36a-458f-84e5-ac95202637fa","Type":"ContainerDied","Data":"942bfe8a8262649d39535f3aaccb4f5d51c39be63b8a2fa442da35eac4bdb2d4"} Apr 20 21:58:49.510717 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.510395 2574 scope.go:117] "RemoveContainer" containerID="924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481" Apr 20 21:58:49.510717 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.510538 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-58cc7b9fdb-6q8k6" Apr 20 21:58:49.520431 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.520410 2574 scope.go:117] "RemoveContainer" containerID="924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481" Apr 20 21:58:49.520716 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:58:49.520696 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481\": container with ID starting with 924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481 not found: ID does not exist" containerID="924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481" Apr 20 21:58:49.520759 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.520726 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481"} err="failed to get container status \"924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481\": rpc error: code = NotFound desc = could not find container \"924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481\": container with ID starting with 924b006465a842c3d402a6896c7d444a7a10ff656703cada164171be8f154481 not found: ID does not exist" Apr 20 21:58:49.533331 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.533308 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-58cc7b9fdb-6q8k6"] Apr 20 21:58:49.535530 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:49.535508 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-58cc7b9fdb-6q8k6"] Apr 20 21:58:50.430978 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:58:50.430945 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e592db2-d36a-458f-84e5-ac95202637fa" path="/var/lib/kubelet/pods/2e592db2-d36a-458f-84e5-ac95202637fa/volumes" Apr 20 21:59:08.578660 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.578626 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-86f4dd7d58-5gqpr"] Apr 20 21:59:08.579022 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.578900 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e592db2-d36a-458f-84e5-ac95202637fa" containerName="manager" Apr 20 21:59:08.579022 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.578910 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e592db2-d36a-458f-84e5-ac95202637fa" containerName="manager" Apr 20 21:59:08.579022 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.578969 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e592db2-d36a-458f-84e5-ac95202637fa" containerName="manager" Apr 20 21:59:08.581839 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.581817 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:08.587404 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.587357 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-86f4dd7d58-5gqpr"] Apr 20 21:59:08.606769 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.606740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l77kh\" (UniqueName: \"kubernetes.io/projected/3092ae0d-af00-4349-86b0-fe8c233ebbb4-kube-api-access-l77kh\") pod \"maas-api-86f4dd7d58-5gqpr\" (UID: \"3092ae0d-af00-4349-86b0-fe8c233ebbb4\") " pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:08.606900 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.606774 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3092ae0d-af00-4349-86b0-fe8c233ebbb4-maas-api-tls\") pod \"maas-api-86f4dd7d58-5gqpr\" (UID: \"3092ae0d-af00-4349-86b0-fe8c233ebbb4\") " pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:08.708061 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.708030 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l77kh\" (UniqueName: \"kubernetes.io/projected/3092ae0d-af00-4349-86b0-fe8c233ebbb4-kube-api-access-l77kh\") pod \"maas-api-86f4dd7d58-5gqpr\" (UID: \"3092ae0d-af00-4349-86b0-fe8c233ebbb4\") " pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:08.708061 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.708069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3092ae0d-af00-4349-86b0-fe8c233ebbb4-maas-api-tls\") pod \"maas-api-86f4dd7d58-5gqpr\" (UID: \"3092ae0d-af00-4349-86b0-fe8c233ebbb4\") " pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:08.710343 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.710321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3092ae0d-af00-4349-86b0-fe8c233ebbb4-maas-api-tls\") pod \"maas-api-86f4dd7d58-5gqpr\" (UID: \"3092ae0d-af00-4349-86b0-fe8c233ebbb4\") " pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:08.717005 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.716979 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l77kh\" (UniqueName: \"kubernetes.io/projected/3092ae0d-af00-4349-86b0-fe8c233ebbb4-kube-api-access-l77kh\") pod \"maas-api-86f4dd7d58-5gqpr\" (UID: \"3092ae0d-af00-4349-86b0-fe8c233ebbb4\") " pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:08.893097 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:08.893071 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:09.008515 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:09.008489 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-86f4dd7d58-5gqpr"] Apr 20 21:59:09.010933 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:59:09.010905 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3092ae0d_af00_4349_86b0_fe8c233ebbb4.slice/crio-9b32c093533f11821b8fa009e528bf69bd11f0ef79ea2e7728c0419249cbdc59 WatchSource:0}: Error finding container 9b32c093533f11821b8fa009e528bf69bd11f0ef79ea2e7728c0419249cbdc59: Status 404 returned error can't find the container with id 9b32c093533f11821b8fa009e528bf69bd11f0ef79ea2e7728c0419249cbdc59 Apr 20 21:59:09.573089 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:09.573050 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-86f4dd7d58-5gqpr" event={"ID":"3092ae0d-af00-4349-86b0-fe8c233ebbb4","Type":"ContainerStarted","Data":"9b32c093533f11821b8fa009e528bf69bd11f0ef79ea2e7728c0419249cbdc59"} Apr 20 21:59:11.580348 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:11.580310 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-86f4dd7d58-5gqpr" event={"ID":"3092ae0d-af00-4349-86b0-fe8c233ebbb4","Type":"ContainerStarted","Data":"68850499e74854a7e27f31178728a339396549e450edac74b86aa29b746eb917"} Apr 20 21:59:11.580744 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:11.580491 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:11.595065 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:11.594991 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-86f4dd7d58-5gqpr" podStartSLOduration=2.004870448 podStartE2EDuration="3.594977405s" podCreationTimestamp="2026-04-20 21:59:08 +0000 UTC" firstStartedPulling="2026-04-20 21:59:09.01224921 +0000 UTC m=+719.173103339" lastFinishedPulling="2026-04-20 21:59:10.602356165 +0000 UTC m=+720.763210296" observedRunningTime="2026-04-20 21:59:11.594444676 +0000 UTC m=+721.755298817" watchObservedRunningTime="2026-04-20 21:59:11.594977405 +0000 UTC m=+721.755831545" Apr 20 21:59:14.579631 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.579596 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj"] Apr 20 21:59:14.582883 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.582866 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.584849 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.584829 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 21:59:14.585071 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.585056 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 21:59:14.585356 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.585330 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 21:59:14.585495 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.585480 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-wxx6x\"" Apr 20 21:59:14.590880 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.590857 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj"] Apr 20 21:59:14.651987 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.651961 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.652125 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.652012 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3001c74-34bd-49d8-9a51-85d1c4df58a4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.652125 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.652056 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.652125 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.652073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.652125 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.652092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzd72\" (UniqueName: \"kubernetes.io/projected/b3001c74-34bd-49d8-9a51-85d1c4df58a4-kube-api-access-kzd72\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.652285 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.652192 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.752948 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.752916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.753099 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.752959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.753099 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.752997 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3001c74-34bd-49d8-9a51-85d1c4df58a4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.753099 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.753019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.753099 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.753035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.753099 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.753056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzd72\" (UniqueName: \"kubernetes.io/projected/b3001c74-34bd-49d8-9a51-85d1c4df58a4-kube-api-access-kzd72\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.753385 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.753341 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.753469 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.753353 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.753519 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.753459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.755265 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.755245 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b3001c74-34bd-49d8-9a51-85d1c4df58a4-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.755532 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.755512 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3001c74-34bd-49d8-9a51-85d1c4df58a4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.760618 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.760601 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzd72\" (UniqueName: \"kubernetes.io/projected/b3001c74-34bd-49d8-9a51-85d1c4df58a4-kube-api-access-kzd72\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w4qcj\" (UID: \"b3001c74-34bd-49d8-9a51-85d1c4df58a4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:14.894252 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:14.894222 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:15.014003 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:15.013980 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj"] Apr 20 21:59:15.016297 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:59:15.016269 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3001c74_34bd_49d8_9a51_85d1c4df58a4.slice/crio-764600a16591ea2c43a4c880dcc905f8393741065e06523a7a2aa4ef36fc35ef WatchSource:0}: Error finding container 764600a16591ea2c43a4c880dcc905f8393741065e06523a7a2aa4ef36fc35ef: Status 404 returned error can't find the container with id 764600a16591ea2c43a4c880dcc905f8393741065e06523a7a2aa4ef36fc35ef Apr 20 21:59:15.595808 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:15.595772 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" event={"ID":"b3001c74-34bd-49d8-9a51-85d1c4df58a4","Type":"ContainerStarted","Data":"764600a16591ea2c43a4c880dcc905f8393741065e06523a7a2aa4ef36fc35ef"} Apr 20 21:59:17.590782 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:17.590748 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-86f4dd7d58-5gqpr" Apr 20 21:59:17.635695 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:17.635658 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7dbc9b4b7f-8r55z"] Apr 20 21:59:17.636066 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:17.636001 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" podUID="ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae" containerName="maas-api" containerID="cri-o://7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37" gracePeriod=30 Apr 20 21:59:17.889915 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:17.889878 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" Apr 20 21:59:17.981437 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:17.981388 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-maas-api-tls\") pod \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " Apr 20 21:59:17.981613 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:17.981511 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tq5j\" (UniqueName: \"kubernetes.io/projected/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-kube-api-access-2tq5j\") pod \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\" (UID: \"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae\") " Apr 20 21:59:17.983956 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:17.983925 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae" (UID: "ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:59:17.984079 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:17.983933 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-kube-api-access-2tq5j" (OuterVolumeSpecName: "kube-api-access-2tq5j") pod "ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae" (UID: "ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae"). InnerVolumeSpecName "kube-api-access-2tq5j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:59:18.082268 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.082213 2574 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-maas-api-tls\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:59:18.082268 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.082269 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2tq5j\" (UniqueName: \"kubernetes.io/projected/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae-kube-api-access-2tq5j\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 21:59:18.608322 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.608291 2574 generic.go:358] "Generic (PLEG): container finished" podID="ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae" containerID="7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37" exitCode=0 Apr 20 21:59:18.608798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.608391 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" Apr 20 21:59:18.608798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.608394 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" event={"ID":"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae","Type":"ContainerDied","Data":"7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37"} Apr 20 21:59:18.608798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.608434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dbc9b4b7f-8r55z" event={"ID":"ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae","Type":"ContainerDied","Data":"117c7856c3f2a0e1204cff5845ff3eeb055e10b7214392341d53c16ad497f37d"} Apr 20 21:59:18.608798 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.608467 2574 scope.go:117] "RemoveContainer" containerID="7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37" Apr 20 21:59:18.617265 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.617236 2574 scope.go:117] "RemoveContainer" containerID="7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37" Apr 20 21:59:18.617625 ip-10-0-140-110 kubenswrapper[2574]: E0420 21:59:18.617590 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37\": container with ID starting with 7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37 not found: ID does not exist" containerID="7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37" Apr 20 21:59:18.617717 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.617638 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37"} err="failed to get container status \"7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37\": rpc 
error: code = NotFound desc = could not find container \"7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37\": container with ID starting with 7a3e67fc5d1f713d0f85350830197ad6e6352a78786c613be5eed70ced99ba37 not found: ID does not exist" Apr 20 21:59:18.629074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.629043 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7dbc9b4b7f-8r55z"] Apr 20 21:59:18.630574 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:18.630554 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7dbc9b4b7f-8r55z"] Apr 20 21:59:20.432490 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:20.432453 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae" path="/var/lib/kubelet/pods/ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae/volumes" Apr 20 21:59:21.621271 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:21.621235 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" event={"ID":"b3001c74-34bd-49d8-9a51-85d1c4df58a4","Type":"ContainerStarted","Data":"d72625773f6e6427ceca9c7f76878a14ae07bb3ac46c16653aa81377ae334c81"} Apr 20 21:59:26.641440 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:26.641402 2574 generic.go:358] "Generic (PLEG): container finished" podID="b3001c74-34bd-49d8-9a51-85d1c4df58a4" containerID="d72625773f6e6427ceca9c7f76878a14ae07bb3ac46c16653aa81377ae334c81" exitCode=0 Apr 20 21:59:26.641749 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:26.641474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" event={"ID":"b3001c74-34bd-49d8-9a51-85d1c4df58a4","Type":"ContainerDied","Data":"d72625773f6e6427ceca9c7f76878a14ae07bb3ac46c16653aa81377ae334c81"} Apr 20 21:59:26.642083 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:26.642069 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:59:27.258574 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.258540 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl"] Apr 20 21:59:27.258964 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.258948 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae" containerName="maas-api" Apr 20 21:59:27.259036 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.258968 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae" containerName="maas-api" Apr 20 21:59:27.259092 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.259036 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac0fd3e8-6eeb-454e-9d76-62dd53fb36ae" containerName="maas-api" Apr 20 21:59:27.262133 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.262111 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.263989 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.263966 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 21:59:27.268664 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.268624 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl"] Apr 20 21:59:27.364872 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.364831 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.365033 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.364888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.365033 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.364996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.365168 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.365059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02db2c9e-660f-478b-863b-7dca60b9a8b1-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.365168 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.365084 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjdqs\" (UniqueName: \"kubernetes.io/projected/02db2c9e-660f-478b-863b-7dca60b9a8b1-kube-api-access-bjdqs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.365168 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.365126 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.466386 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.466330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-dshm\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.466557 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.466472 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.466557 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.466501 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.466557 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.466548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.466719 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.466584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02db2c9e-660f-478b-863b-7dca60b9a8b1-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.466719 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.466600 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjdqs\" (UniqueName: \"kubernetes.io/projected/02db2c9e-660f-478b-863b-7dca60b9a8b1-kube-api-access-bjdqs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.466903 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.466876 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.467234 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.467178 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.467234 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.467201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-model-cache\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.468967 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.468934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/02db2c9e-660f-478b-863b-7dca60b9a8b1-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.469333 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.469309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/02db2c9e-660f-478b-863b-7dca60b9a8b1-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.473660 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.473641 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjdqs\" (UniqueName: \"kubernetes.io/projected/02db2c9e-660f-478b-863b-7dca60b9a8b1-kube-api-access-bjdqs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl\" (UID: \"02db2c9e-660f-478b-863b-7dca60b9a8b1\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.576758 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.576681 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:27.789581 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:27.789555 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl"] Apr 20 21:59:27.792096 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:59:27.792065 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02db2c9e_660f_478b_863b_7dca60b9a8b1.slice/crio-78bb7c2581af0c369c5fdb13d6450a50df449f2b076c2dfdf98eb5d878f782a9 WatchSource:0}: Error finding container 78bb7c2581af0c369c5fdb13d6450a50df449f2b076c2dfdf98eb5d878f782a9: Status 404 returned error can't find the container with id 78bb7c2581af0c369c5fdb13d6450a50df449f2b076c2dfdf98eb5d878f782a9 Apr 20 21:59:28.651031 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:28.650986 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" event={"ID":"b3001c74-34bd-49d8-9a51-85d1c4df58a4","Type":"ContainerStarted","Data":"cb617ab21ac13ff1e22bebbe8e747460e59de504fd216e60e4d27135a615a1c9"} Apr 20 21:59:28.651218 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:28.651201 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:28.652338 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:28.652311 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" event={"ID":"02db2c9e-660f-478b-863b-7dca60b9a8b1","Type":"ContainerStarted","Data":"0d8b1aa74f31ba71608ec7ffe0104f6f3da0138bc4a04fde0d5082a63feb5532"} Apr 20 21:59:28.652338 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:28.652341 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" 
event={"ID":"02db2c9e-660f-478b-863b-7dca60b9a8b1","Type":"ContainerStarted","Data":"78bb7c2581af0c369c5fdb13d6450a50df449f2b076c2dfdf98eb5d878f782a9"} Apr 20 21:59:28.669290 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:28.669226 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" podStartSLOduration=1.9630429440000001 podStartE2EDuration="14.669213572s" podCreationTimestamp="2026-04-20 21:59:14 +0000 UTC" firstStartedPulling="2026-04-20 21:59:15.018103647 +0000 UTC m=+725.178957769" lastFinishedPulling="2026-04-20 21:59:27.724274267 +0000 UTC m=+737.885128397" observedRunningTime="2026-04-20 21:59:28.66780303 +0000 UTC m=+738.828657171" watchObservedRunningTime="2026-04-20 21:59:28.669213572 +0000 UTC m=+738.830067738" Apr 20 21:59:33.670790 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:33.670754 2574 generic.go:358] "Generic (PLEG): container finished" podID="02db2c9e-660f-478b-863b-7dca60b9a8b1" containerID="0d8b1aa74f31ba71608ec7ffe0104f6f3da0138bc4a04fde0d5082a63feb5532" exitCode=0 Apr 20 21:59:33.671305 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:33.670833 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" event={"ID":"02db2c9e-660f-478b-863b-7dca60b9a8b1","Type":"ContainerDied","Data":"0d8b1aa74f31ba71608ec7ffe0104f6f3da0138bc4a04fde0d5082a63feb5532"} Apr 20 21:59:34.675136 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:34.675098 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" event={"ID":"02db2c9e-660f-478b-863b-7dca60b9a8b1","Type":"ContainerStarted","Data":"be92509f6996279bde1573ca7918322090a8416ed51acf48d9ccba4816eb5286"} Apr 20 21:59:34.675542 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:34.675320 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:34.693199 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:34.693142 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" podStartSLOduration=7.496616013 podStartE2EDuration="7.693128349s" podCreationTimestamp="2026-04-20 21:59:27 +0000 UTC" firstStartedPulling="2026-04-20 21:59:33.671594575 +0000 UTC m=+743.832448693" lastFinishedPulling="2026-04-20 21:59:33.8681069 +0000 UTC m=+744.028961029" observedRunningTime="2026-04-20 21:59:34.690825514 +0000 UTC m=+744.851679653" watchObservedRunningTime="2026-04-20 21:59:34.693128349 +0000 UTC m=+744.853982505" Apr 20 21:59:39.668234 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:39.668204 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w4qcj" Apr 20 21:59:45.691708 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:45.691672 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl" Apr 20 21:59:50.655914 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.655879 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6"] Apr 20 21:59:50.688941 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.688911 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6"] Apr 20 21:59:50.689094 ip-10-0-140-110 
kubenswrapper[2574]: I0420 21:59:50.689018 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.691054 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.691035 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 21:59:50.766650 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.766618 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.766814 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.766677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.766814 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.766695 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srb4j\" (UniqueName: \"kubernetes.io/projected/3e1733d6-b767-45b5-ac8e-a388810e916b-kube-api-access-srb4j\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.766814 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.766715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.766814 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.766732 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1733d6-b767-45b5-ac8e-a388810e916b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.766814 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.766749 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867144 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867144 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srb4j\" (UniqueName: \"kubernetes.io/projected/3e1733d6-b767-45b5-ac8e-a388810e916b-kube-api-access-srb4j\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867485 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867166 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867485 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1733d6-b767-45b5-ac8e-a388810e916b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867485 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867627 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867627 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867572 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867807 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.867924 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.867784 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.869328 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.869303 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3e1733d6-b767-45b5-ac8e-a388810e916b-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.869785 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.869763 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3e1733d6-b767-45b5-ac8e-a388810e916b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.874120 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.874095 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srb4j\" (UniqueName: \"kubernetes.io/projected/3e1733d6-b767-45b5-ac8e-a388810e916b-kube-api-access-srb4j\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6\" (UID: \"3e1733d6-b767-45b5-ac8e-a388810e916b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:50.998222 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:50.998135 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 21:59:51.121096 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:51.121054 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6"] Apr 20 21:59:51.124402 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:59:51.124319 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e1733d6_b767_45b5_ac8e_a388810e916b.slice/crio-d9ae656e5945404a7609549906437b2a5175b00e5a1ab7458d648b987ba33906 WatchSource:0}: Error finding container d9ae656e5945404a7609549906437b2a5175b00e5a1ab7458d648b987ba33906: Status 404 returned error can't find the container with id d9ae656e5945404a7609549906437b2a5175b00e5a1ab7458d648b987ba33906 Apr 20 21:59:51.736110 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:51.736078 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" event={"ID":"3e1733d6-b767-45b5-ac8e-a388810e916b","Type":"ContainerStarted","Data":"2cbc93cdd621b0b6205fcff6107f9c5b7779d56d6cab4ed0c0548173580af0e6"} Apr 20 21:59:51.736110 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:51.736114 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" event={"ID":"3e1733d6-b767-45b5-ac8e-a388810e916b","Type":"ContainerStarted","Data":"d9ae656e5945404a7609549906437b2a5175b00e5a1ab7458d648b987ba33906"} Apr 20 21:59:58.355231 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.355184 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x"] Apr 20 21:59:58.360622 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.360593 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.362862 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.362835 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 21:59:58.366148 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.366119 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x"] Apr 20 21:59:58.430609 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.430570 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d2be12-e84a-4da0-8aac-0090aca11c46-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.430793 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.430657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tsfg\" (UniqueName: \"kubernetes.io/projected/e5d2be12-e84a-4da0-8aac-0090aca11c46-kube-api-access-6tsfg\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.430793 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.430719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.430793 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.430741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.430793 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.430764 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.430999 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.430815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.531832 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.531793 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: 
\"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.531832 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.531831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.531832 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.531857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.531832 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.531886 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.532291 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.532065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d2be12-e84a-4da0-8aac-0090aca11c46-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.532348 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.532323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tsfg\" (UniqueName: \"kubernetes.io/projected/e5d2be12-e84a-4da0-8aac-0090aca11c46-kube-api-access-6tsfg\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.532442 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.532404 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.532524 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.532502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.532855 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.532527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.535141 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.535091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5d2be12-e84a-4da0-8aac-0090aca11c46-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.535296 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.535265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5d2be12-e84a-4da0-8aac-0090aca11c46-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.545286 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.545257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tsfg\" (UniqueName: \"kubernetes.io/projected/e5d2be12-e84a-4da0-8aac-0090aca11c46-kube-api-access-6tsfg\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x\" (UID: \"e5d2be12-e84a-4da0-8aac-0090aca11c46\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.674504 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.674412 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 21:59:58.805530 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:58.805454 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x"] Apr 20 21:59:58.809873 ip-10-0-140-110 kubenswrapper[2574]: W0420 21:59:58.809843 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5d2be12_e84a_4da0_8aac_0090aca11c46.slice/crio-11ab7084687ba3402df16ad16ea6a28224a3b82460dcc2a8c1e1a7205b3a3078 WatchSource:0}: Error finding container 11ab7084687ba3402df16ad16ea6a28224a3b82460dcc2a8c1e1a7205b3a3078: Status 404 returned error can't find the container with id 11ab7084687ba3402df16ad16ea6a28224a3b82460dcc2a8c1e1a7205b3a3078 Apr 20 21:59:59.765074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:59.765037 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" event={"ID":"e5d2be12-e84a-4da0-8aac-0090aca11c46","Type":"ContainerStarted","Data":"ece4e9af038dd660f77cf251663030671f7e04141fd8e30aa79790c654fe9f61"} Apr 20 21:59:59.765074 ip-10-0-140-110 kubenswrapper[2574]: I0420 21:59:59.765076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" event={"ID":"e5d2be12-e84a-4da0-8aac-0090aca11c46","Type":"ContainerStarted","Data":"11ab7084687ba3402df16ad16ea6a28224a3b82460dcc2a8c1e1a7205b3a3078"} Apr 20 22:00:00.769736 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:00.769698 2574 generic.go:358] "Generic (PLEG): container finished" podID="3e1733d6-b767-45b5-ac8e-a388810e916b" containerID="2cbc93cdd621b0b6205fcff6107f9c5b7779d56d6cab4ed0c0548173580af0e6" exitCode=0 Apr 20 22:00:00.770200 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:00.769787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" 
event={"ID":"3e1733d6-b767-45b5-ac8e-a388810e916b","Type":"ContainerDied","Data":"2cbc93cdd621b0b6205fcff6107f9c5b7779d56d6cab4ed0c0548173580af0e6"} Apr 20 22:00:02.778957 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:02.778914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" event={"ID":"3e1733d6-b767-45b5-ac8e-a388810e916b","Type":"ContainerStarted","Data":"e9681805501ce222023661bff753f33234e1179502347772b5c24ce054d5bed5"} Apr 20 22:00:02.779405 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:02.779140 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 22:00:02.797734 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:02.797686 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" podStartSLOduration=11.815986109 podStartE2EDuration="12.797671971s" podCreationTimestamp="2026-04-20 21:59:50 +0000 UTC" firstStartedPulling="2026-04-20 22:00:00.770538437 +0000 UTC m=+770.931392554" lastFinishedPulling="2026-04-20 22:00:01.752224287 +0000 UTC m=+771.913078416" observedRunningTime="2026-04-20 22:00:02.796260081 +0000 UTC m=+772.957114222" watchObservedRunningTime="2026-04-20 22:00:02.797671971 +0000 UTC m=+772.958526110" Apr 20 22:00:04.787415 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:04.787360 2574 generic.go:358] "Generic (PLEG): container finished" podID="e5d2be12-e84a-4da0-8aac-0090aca11c46" containerID="ece4e9af038dd660f77cf251663030671f7e04141fd8e30aa79790c654fe9f61" exitCode=0 Apr 20 22:00:04.787832 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:04.787440 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" event={"ID":"e5d2be12-e84a-4da0-8aac-0090aca11c46","Type":"ContainerDied","Data":"ece4e9af038dd660f77cf251663030671f7e04141fd8e30aa79790c654fe9f61"} Apr 20 22:00:10.810557 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:10.810520 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" event={"ID":"e5d2be12-e84a-4da0-8aac-0090aca11c46","Type":"ContainerStarted","Data":"6d4e7c38d257e3e35599698f885234495026d4c721bf987c4c5293b07b0584b5"} Apr 20 22:00:10.810963 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:10.810758 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 22:00:10.829229 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:10.829124 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" podStartSLOduration=7.835401108 podStartE2EDuration="12.829109537s" podCreationTimestamp="2026-04-20 21:59:58 +0000 UTC" firstStartedPulling="2026-04-20 22:00:04.788003308 +0000 UTC m=+774.948857425" lastFinishedPulling="2026-04-20 22:00:09.781711732 +0000 UTC m=+779.942565854" observedRunningTime="2026-04-20 22:00:10.827849449 +0000 UTC m=+780.988703592" watchObservedRunningTime="2026-04-20 22:00:10.829109537 +0000 UTC m=+780.989963676" Apr 20 22:00:13.796417 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:13.796354 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6" Apr 20 22:00:21.827064 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:21.827035 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x" Apr 20 22:00:46.132965 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.132886 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7d5979b6b9-vdn8z"] Apr 20 22:00:46.136103 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.136085 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" Apr 20 22:00:46.143642 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.143619 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d5979b6b9-vdn8z"] Apr 20 22:00:46.245944 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.245887 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9kbr\" (UniqueName: \"kubernetes.io/projected/84c88f93-7f08-4188-8989-9fee1fdc4df8-kube-api-access-n9kbr\") pod \"authorino-7d5979b6b9-vdn8z\" (UID: \"84c88f93-7f08-4188-8989-9fee1fdc4df8\") " pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" Apr 20 22:00:46.246137 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.245969 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/84c88f93-7f08-4188-8989-9fee1fdc4df8-tls-cert\") pod \"authorino-7d5979b6b9-vdn8z\" (UID: \"84c88f93-7f08-4188-8989-9fee1fdc4df8\") " pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" Apr 20 22:00:46.346748 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.346709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/84c88f93-7f08-4188-8989-9fee1fdc4df8-tls-cert\") pod \"authorino-7d5979b6b9-vdn8z\" (UID: \"84c88f93-7f08-4188-8989-9fee1fdc4df8\") " pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" Apr 20 22:00:46.346903 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.346807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9kbr\" (UniqueName: \"kubernetes.io/projected/84c88f93-7f08-4188-8989-9fee1fdc4df8-kube-api-access-n9kbr\") pod \"authorino-7d5979b6b9-vdn8z\" (UID: \"84c88f93-7f08-4188-8989-9fee1fdc4df8\") " pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" Apr 20 22:00:46.349170 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.349151 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/84c88f93-7f08-4188-8989-9fee1fdc4df8-tls-cert\") pod \"authorino-7d5979b6b9-vdn8z\" (UID: \"84c88f93-7f08-4188-8989-9fee1fdc4df8\") " pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" Apr 20 22:00:46.355786 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.355765 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9kbr\" (UniqueName: \"kubernetes.io/projected/84c88f93-7f08-4188-8989-9fee1fdc4df8-kube-api-access-n9kbr\") pod \"authorino-7d5979b6b9-vdn8z\" (UID: \"84c88f93-7f08-4188-8989-9fee1fdc4df8\") " pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" Apr 20 22:00:46.445834 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.445760 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" Apr 20 22:00:46.563082 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.562979 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d5979b6b9-vdn8z"] Apr 20 22:00:46.565203 ip-10-0-140-110 kubenswrapper[2574]: W0420 22:00:46.565172 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84c88f93_7f08_4188_8989_9fee1fdc4df8.slice/crio-ec0f2f67f4a536105c803ccd6b527caca3c78a027f0499fc83419b43ad05b7a4 WatchSource:0}: Error finding container ec0f2f67f4a536105c803ccd6b527caca3c78a027f0499fc83419b43ad05b7a4: Status 404 returned error can't find the container with id ec0f2f67f4a536105c803ccd6b527caca3c78a027f0499fc83419b43ad05b7a4 Apr 20 22:00:46.925299 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:46.925271 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" event={"ID":"84c88f93-7f08-4188-8989-9fee1fdc4df8","Type":"ContainerStarted","Data":"ec0f2f67f4a536105c803ccd6b527caca3c78a027f0499fc83419b43ad05b7a4"} Apr 20 22:00:47.931115 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:47.931064 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" event={"ID":"84c88f93-7f08-4188-8989-9fee1fdc4df8","Type":"ContainerStarted","Data":"5af1c512e9fcc9a4867162d70baea49d4284ed6a734a027d40df06b1bdd7b846"} Apr 20 22:00:47.944567 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:47.944512 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7d5979b6b9-vdn8z" podStartSLOduration=1.6485256769999999 podStartE2EDuration="1.944492619s" podCreationTimestamp="2026-04-20 22:00:46 +0000 UTC" firstStartedPulling="2026-04-20 22:00:46.566456929 +0000 UTC m=+816.727311047" lastFinishedPulling="2026-04-20 22:00:46.862423856 +0000 UTC m=+817.023277989" observedRunningTime="2026-04-20 22:00:47.943698825 +0000 UTC m=+818.104552964" watchObservedRunningTime="2026-04-20 22:00:47.944492619 +0000 UTC m=+818.105346763" Apr 20 22:00:47.969588 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:47.969558 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-bb8f8449b-txvkq"] Apr 20 22:00:47.969774 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:47.969741 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-bb8f8449b-txvkq" podUID="7a3ee233-4a16-4f4e-b4af-c93a965da896" containerName="authorino" containerID="cri-o://f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71" gracePeriod=30 Apr 20 22:00:48.215278 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.215250 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 22:00:48.262357 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.262325 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7a3ee233-4a16-4f4e-b4af-c93a965da896-tls-cert\") pod \"7a3ee233-4a16-4f4e-b4af-c93a965da896\" (UID: \"7a3ee233-4a16-4f4e-b4af-c93a965da896\") " Apr 20 22:00:48.262533 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.262401 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk5jc\" (UniqueName: \"kubernetes.io/projected/7a3ee233-4a16-4f4e-b4af-c93a965da896-kube-api-access-wk5jc\") pod \"7a3ee233-4a16-4f4e-b4af-c93a965da896\" (UID: \"7a3ee233-4a16-4f4e-b4af-c93a965da896\") " Apr 20 22:00:48.264357 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.264326 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3ee233-4a16-4f4e-b4af-c93a965da896-kube-api-access-wk5jc" (OuterVolumeSpecName: "kube-api-access-wk5jc") pod "7a3ee233-4a16-4f4e-b4af-c93a965da896" (UID: "7a3ee233-4a16-4f4e-b4af-c93a965da896"). InnerVolumeSpecName "kube-api-access-wk5jc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:00:48.271746 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.271721 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3ee233-4a16-4f4e-b4af-c93a965da896-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "7a3ee233-4a16-4f4e-b4af-c93a965da896" (UID: "7a3ee233-4a16-4f4e-b4af-c93a965da896"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:00:48.363615 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.363580 2574 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7a3ee233-4a16-4f4e-b4af-c93a965da896-tls-cert\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 22:00:48.363744 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.363619 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wk5jc\" (UniqueName: \"kubernetes.io/projected/7a3ee233-4a16-4f4e-b4af-c93a965da896-kube-api-access-wk5jc\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 22:00:48.935531 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.935498 2574 generic.go:358] "Generic (PLEG): container finished" podID="7a3ee233-4a16-4f4e-b4af-c93a965da896" containerID="f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71" exitCode=0 Apr 20 22:00:48.935966 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.935549 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-bb8f8449b-txvkq" Apr 20 22:00:48.935966 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.935590 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-bb8f8449b-txvkq" event={"ID":"7a3ee233-4a16-4f4e-b4af-c93a965da896","Type":"ContainerDied","Data":"f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71"} Apr 20 22:00:48.935966 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.935635 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-bb8f8449b-txvkq" event={"ID":"7a3ee233-4a16-4f4e-b4af-c93a965da896","Type":"ContainerDied","Data":"7ae6537571630a797c4eac23882ac7dbd8e2e5485f74732c5ea90fd581033e0c"} Apr 20 22:00:48.935966 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.935658 2574 scope.go:117] "RemoveContainer" containerID="f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71" Apr 20 22:00:48.943456 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.943442 2574 scope.go:117] "RemoveContainer" containerID="f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71" Apr 20 22:00:48.943693 ip-10-0-140-110 kubenswrapper[2574]: E0420 22:00:48.943674 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71\": container with ID starting with f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71 not found: ID does not exist" containerID="f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71" Apr 20 22:00:48.943754 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.943706 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71"} err="failed to get container status \"f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71\": rpc error: code = NotFound desc = could not find container \"f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71\": container with ID starting with f2f077687da9ebc1c09d5002d69b5bf2c894fefeaf1dac7eb972bb1b2dfd2d71 not found: ID does not exist" Apr 20 22:00:48.951528 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.951507 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-bb8f8449b-txvkq"] Apr 20 22:00:48.954871 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:48.954852 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-bb8f8449b-txvkq"] Apr 20 22:00:50.431228 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:00:50.431193 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3ee233-4a16-4f4e-b4af-c93a965da896" path="/var/lib/kubelet/pods/7a3ee233-4a16-4f4e-b4af-c93a965da896/volumes" Apr 20 22:02:10.364028 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:10.364001 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:02:10.365015 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:10.364996 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:02:13.327918 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:13.327886 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["opendatahub/maas-controller-57d97678df-v9bqv"] Apr 20 22:02:13.328322 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:13.328140 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-57d97678df-v9bqv" podUID="298f5698-70ee-4fcb-b2d6-b7238bc3d4e6" containerName="manager" containerID="cri-o://8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f" gracePeriod=10 Apr 20 22:02:13.568093 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:13.568069 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 22:02:13.705945 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:13.705909 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtbm5\" (UniqueName: \"kubernetes.io/projected/298f5698-70ee-4fcb-b2d6-b7238bc3d4e6-kube-api-access-vtbm5\") pod \"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6\" (UID: \"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6\") " Apr 20 22:02:13.707936 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:13.707907 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/298f5698-70ee-4fcb-b2d6-b7238bc3d4e6-kube-api-access-vtbm5" (OuterVolumeSpecName: "kube-api-access-vtbm5") pod "298f5698-70ee-4fcb-b2d6-b7238bc3d4e6" (UID: "298f5698-70ee-4fcb-b2d6-b7238bc3d4e6"). InnerVolumeSpecName "kube-api-access-vtbm5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:02:13.807386 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:13.807350 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtbm5\" (UniqueName: \"kubernetes.io/projected/298f5698-70ee-4fcb-b2d6-b7238bc3d4e6-kube-api-access-vtbm5\") on node \"ip-10-0-140-110.ec2.internal\" DevicePath \"\"" Apr 20 22:02:14.214596 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.214555 2574 generic.go:358] "Generic (PLEG): container finished" podID="298f5698-70ee-4fcb-b2d6-b7238bc3d4e6" containerID="8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f" exitCode=0 Apr 20 22:02:14.214796 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.214629 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-57d97678df-v9bqv" Apr 20 22:02:14.214796 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.214647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57d97678df-v9bqv" event={"ID":"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6","Type":"ContainerDied","Data":"8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f"} Apr 20 22:02:14.214796 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.214683 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57d97678df-v9bqv" event={"ID":"298f5698-70ee-4fcb-b2d6-b7238bc3d4e6","Type":"ContainerDied","Data":"f669eeb73410ece75f1bb04ede07397829b8bb51d78a7e07322ea4b06a08ecb8"} Apr 20 22:02:14.214796 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.214698 2574 scope.go:117] "RemoveContainer" containerID="8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f" Apr 20 22:02:14.222896 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.222874 2574 scope.go:117] "RemoveContainer" containerID="8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f" Apr 20 22:02:14.223160 ip-10-0-140-110 kubenswrapper[2574]: E0420 22:02:14.223142 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f\": container with ID starting with 8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f not found: ID does not exist" containerID="8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f" Apr 20 22:02:14.223214 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.223167 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f"} err="failed to get container status \"8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f\": rpc error: code = NotFound desc = could not find container \"8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f\": container with ID starting with 8262539acc7708af2e8e094075bd430e35d0d3aa41d00410cfc1ff52897ff44f not found: ID does not exist" Apr 20 22:02:14.234094 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.234069 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-57d97678df-v9bqv"] Apr 20 22:02:14.237625 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.237603 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-57d97678df-v9bqv"] Apr 20 22:02:14.431671 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.431640 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="298f5698-70ee-4fcb-b2d6-b7238bc3d4e6" path="/var/lib/kubelet/pods/298f5698-70ee-4fcb-b2d6-b7238bc3d4e6/volumes" Apr 20 22:02:14.969995 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.969963 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-57d97678df-h4wrf"] Apr 20 22:02:14.970280 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.970268 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a3ee233-4a16-4f4e-b4af-c93a965da896" containerName="authorino" Apr 20 22:02:14.970341 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.970283 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3ee233-4a16-4f4e-b4af-c93a965da896" containerName="authorino" Apr 20 22:02:14.970341 ip-10-0-140-110 
kubenswrapper[2574]: I0420 22:02:14.970294 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="298f5698-70ee-4fcb-b2d6-b7238bc3d4e6" containerName="manager" Apr 20 22:02:14.970341 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.970299 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="298f5698-70ee-4fcb-b2d6-b7238bc3d4e6" containerName="manager" Apr 20 22:02:14.970464 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.970355 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="298f5698-70ee-4fcb-b2d6-b7238bc3d4e6" containerName="manager" Apr 20 22:02:14.970464 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.970364 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a3ee233-4a16-4f4e-b4af-c93a965da896" containerName="authorino" Apr 20 22:02:14.974329 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.974313 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-57d97678df-h4wrf" Apr 20 22:02:14.976061 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.976033 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-j2fgl\"" Apr 20 22:02:14.978884 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:14.978863 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-57d97678df-h4wrf"] Apr 20 22:02:15.118323 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:15.118291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6vrv\" (UniqueName: \"kubernetes.io/projected/6ca35513-25fa-495b-b50d-87971a4984e4-kube-api-access-j6vrv\") pod \"maas-controller-57d97678df-h4wrf\" (UID: \"6ca35513-25fa-495b-b50d-87971a4984e4\") " pod="opendatahub/maas-controller-57d97678df-h4wrf" Apr 20 22:02:15.218749 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:15.218709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6vrv\" (UniqueName: \"kubernetes.io/projected/6ca35513-25fa-495b-b50d-87971a4984e4-kube-api-access-j6vrv\") pod \"maas-controller-57d97678df-h4wrf\" (UID: \"6ca35513-25fa-495b-b50d-87971a4984e4\") " pod="opendatahub/maas-controller-57d97678df-h4wrf" Apr 20 22:02:15.226494 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:15.226431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6vrv\" (UniqueName: \"kubernetes.io/projected/6ca35513-25fa-495b-b50d-87971a4984e4-kube-api-access-j6vrv\") pod \"maas-controller-57d97678df-h4wrf\" (UID: \"6ca35513-25fa-495b-b50d-87971a4984e4\") " pod="opendatahub/maas-controller-57d97678df-h4wrf" Apr 20 22:02:15.285276 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:15.285243 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-57d97678df-h4wrf" Apr 20 22:02:15.400433 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:15.400404 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-57d97678df-h4wrf"] Apr 20 22:02:15.402907 ip-10-0-140-110 kubenswrapper[2574]: W0420 22:02:15.402880 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ca35513_25fa_495b_b50d_87971a4984e4.slice/crio-d7eb1fd856b40e6eb579e8cd8f7d7638d8bbf649522771e542330e4b69e60a5f WatchSource:0}: Error finding container d7eb1fd856b40e6eb579e8cd8f7d7638d8bbf649522771e542330e4b69e60a5f: Status 404 returned error can't find the container with id d7eb1fd856b40e6eb579e8cd8f7d7638d8bbf649522771e542330e4b69e60a5f Apr 20 22:02:16.222828 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:16.222742 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57d97678df-h4wrf" event={"ID":"6ca35513-25fa-495b-b50d-87971a4984e4","Type":"ContainerStarted","Data":"ea2af4ce8e4165653cb8d82faba2e2c0c5eafd2ff5eadc855ee9ef3443e02c2a"} Apr 20 22:02:16.222828 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:16.222776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-57d97678df-h4wrf" event={"ID":"6ca35513-25fa-495b-b50d-87971a4984e4","Type":"ContainerStarted","Data":"d7eb1fd856b40e6eb579e8cd8f7d7638d8bbf649522771e542330e4b69e60a5f"} Apr 20 22:02:16.222828 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:16.222800 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-57d97678df-h4wrf" Apr 20 22:02:16.238013 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:16.237959 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-57d97678df-h4wrf" podStartSLOduration=1.701022129 podStartE2EDuration="2.237944186s" podCreationTimestamp="2026-04-20 22:02:14 +0000 UTC" firstStartedPulling="2026-04-20 22:02:15.404170314 +0000 UTC m=+905.565024431" lastFinishedPulling="2026-04-20 22:02:15.941092369 +0000 UTC m=+906.101946488" observedRunningTime="2026-04-20 22:02:16.236294063 +0000 UTC m=+906.397148217" watchObservedRunningTime="2026-04-20 22:02:16.237944186 +0000 UTC m=+906.398798326" Apr 20 22:02:27.231900 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:02:27.231872 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-57d97678df-h4wrf" Apr 20 22:07:10.385939 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:07:10.385912 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:07:10.387211 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:07:10.387190 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:12:10.412023 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:12:10.411996 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:12:10.414502 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:12:10.413102 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:17:10.436736 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:17:10.436619 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:17:10.440309 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:17:10.438458 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:22:10.458336 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:22:10.458232 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:22:10.462430 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:22:10.461355 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:23:11.432441 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:11.432407 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7d5979b6b9-vdn8z_84c88f93-7f08-4188-8989-9fee1fdc4df8/authorino/0.log" Apr 20 22:23:15.288057 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:15.288021 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-86f4dd7d58-5gqpr_3092ae0d-af00-4349-86b0-fe8c233ebbb4/maas-api/0.log" Apr 20 22:23:15.404039 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:15.404008 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-57d97678df-h4wrf_6ca35513-25fa-495b-b50d-87971a4984e4/manager/0.log" Apr 20 22:23:15.878159 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:15.878123 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-f5f47469b-pqzwt_e80eab89-a43e-4d96-b8be-1bc48afb35f6/manager/0.log" Apr 20 22:23:17.231059 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:17.231028 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7d5979b6b9-vdn8z_84c88f93-7f08-4188-8989-9fee1fdc4df8/authorino/0.log" Apr 20 22:23:17.707123 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:17.707090 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-6kq45_f855ab67-bf36-4e85-9116-8a11da0acd4c/registry-server/0.log" Apr 20 22:23:18.078136 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:18.078055 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2hn94_29b22937-44fb-43ee-8c4d-ceb0a606443c/manager/0.log" Apr 20 22:23:18.425524 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:18.425494 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h_716e293f-d7b5-4987-87c9-9f07afbd37d3/istio-proxy/0.log" Apr 20 22:23:18.873883 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:18.873855 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-8rs64_2b3c675b-1eb0-4857-8212-055e3a3de56b/istio-proxy/0.log" Apr 20 22:23:19.453587 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:19.453556 2574 log.go:25] "Finished parsing 
log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x_e5d2be12-e84a-4da0-8aac-0090aca11c46/storage-initializer/0.log" Apr 20 22:23:19.461663 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:19.461632 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-rpd5x_e5d2be12-e84a-4da0-8aac-0090aca11c46/main/0.log" Apr 20 22:23:19.568447 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:19.568407 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-w4qcj_b3001c74-34bd-49d8-9a51-85d1c4df58a4/storage-initializer/0.log" Apr 20 22:23:19.576116 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:19.576095 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-w4qcj_b3001c74-34bd-49d8-9a51-85d1c4df58a4/main/0.log" Apr 20 22:23:19.801749 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:19.801671 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl_02db2c9e-660f-478b-863b-7dca60b9a8b1/storage-initializer/0.log" Apr 20 22:23:19.809100 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:19.809073 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-mpngl_02db2c9e-660f-478b-863b-7dca60b9a8b1/main/0.log" Apr 20 22:23:19.912615 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:19.912588 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6_3e1733d6-b767-45b5-ac8e-a388810e916b/storage-initializer/0.log" Apr 20 22:23:19.920075 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:19.920053 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-cbxc6_3e1733d6-b767-45b5-ac8e-a388810e916b/main/0.log" Apr 20 22:23:26.206768 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:26.206726 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7524q_f57a85f5-bd23-4292-9e22-6f0078a7e4f0/global-pull-secret-syncer/0.log" Apr 20 22:23:26.354995 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:26.354961 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tgvx7_7548fad1-54fd-45fb-87f3-3c9b7d8d2573/konnectivity-agent/0.log" Apr 20 22:23:26.445405 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:26.445355 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-110.ec2.internal_27ef09a0806330265539f634ef8e0e80/haproxy/0.log" Apr 20 22:23:30.893474 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:30.893440 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7d5979b6b9-vdn8z_84c88f93-7f08-4188-8989-9fee1fdc4df8/authorino/0.log" Apr 20 22:23:31.010678 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:31.010646 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-6kq45_f855ab67-bf36-4e85-9116-8a11da0acd4c/registry-server/0.log" Apr 20 22:23:31.201853 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:31.201769 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-2hn94_29b22937-44fb-43ee-8c4d-ceb0a606443c/manager/0.log" Apr 20 22:23:33.018194 ip-10-0-140-110 kubenswrapper[2574]: I0420 
22:23:33.018168 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ccgvl_29c6b565-3905-48a9-b7e8-3853908ddeb8/node-exporter/0.log" Apr 20 22:23:33.036575 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:33.036554 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ccgvl_29c6b565-3905-48a9-b7e8-3853908ddeb8/kube-rbac-proxy/0.log" Apr 20 22:23:33.055165 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:33.055146 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ccgvl_29c6b565-3905-48a9-b7e8-3853908ddeb8/init-textfile/0.log" Apr 20 22:23:35.065302 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.065269 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp"] Apr 20 22:23:35.068888 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.068869 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.071136 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.071112 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5l2ks\"/\"default-dockercfg-wqjtp\"" Apr 20 22:23:35.071247 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.071113 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5l2ks\"/\"openshift-service-ca.crt\"" Apr 20 22:23:35.071609 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.071589 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5l2ks\"/\"kube-root-ca.crt\"" Apr 20 22:23:35.074816 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.074797 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp"] Apr 20 22:23:35.115618 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.115591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptq6x\" (UniqueName: \"kubernetes.io/projected/32436b7b-958b-439b-a586-c6df436958cc-kube-api-access-ptq6x\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.115760 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.115632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-proc\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.115760 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.115658 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-sys\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.115760 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.115740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-lib-modules\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.115876 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.115773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-podres\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216387 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-lib-modules\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216566 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216400 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-podres\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216566 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptq6x\" (UniqueName: \"kubernetes.io/projected/32436b7b-958b-439b-a586-c6df436958cc-kube-api-access-ptq6x\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216566 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-proc\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216566 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216499 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-sys\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216749 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-lib-modules\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216749 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216577 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-sys\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " 
pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216749 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216577 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-proc\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.216749 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.216582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/32436b7b-958b-439b-a586-c6df436958cc-podres\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.223854 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.223832 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptq6x\" (UniqueName: \"kubernetes.io/projected/32436b7b-958b-439b-a586-c6df436958cc-kube-api-access-ptq6x\") pod \"perf-node-gather-daemonset-bdjxp\" (UID: \"32436b7b-958b-439b-a586-c6df436958cc\") " pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.379979 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.379953 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:35.495404 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.495331 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp"] Apr 20 22:23:35.498001 ip-10-0-140-110 kubenswrapper[2574]: W0420 22:23:35.497960 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod32436b7b_958b_439b_a586_c6df436958cc.slice/crio-91a62d8dc9a1b163ae02fb9e3555238bca51e6e5b84956f6bd2282ff3c19d15c WatchSource:0}: Error finding container 91a62d8dc9a1b163ae02fb9e3555238bca51e6e5b84956f6bd2282ff3c19d15c: Status 404 returned error can't find the container with id 91a62d8dc9a1b163ae02fb9e3555238bca51e6e5b84956f6bd2282ff3c19d15c Apr 20 22:23:35.499517 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:35.499497 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 22:23:36.353170 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:36.353136 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" event={"ID":"32436b7b-958b-439b-a586-c6df436958cc","Type":"ContainerStarted","Data":"46b9e30871bb732004df7e88b2a4309acf98ec1ae928e7556eed2981a4a36459"} Apr 20 22:23:36.353170 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:36.353173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" event={"ID":"32436b7b-958b-439b-a586-c6df436958cc","Type":"ContainerStarted","Data":"91a62d8dc9a1b163ae02fb9e3555238bca51e6e5b84956f6bd2282ff3c19d15c"} Apr 20 22:23:36.353637 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:36.353225 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:36.367657 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:36.367602 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" podStartSLOduration=1.367588519 podStartE2EDuration="1.367588519s" podCreationTimestamp="2026-04-20 22:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:23:36.36608378 +0000 UTC m=+2186.526937921" watchObservedRunningTime="2026-04-20 22:23:36.367588519 +0000 UTC m=+2186.528442659" Apr 20 22:23:37.086429 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:37.086400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vb97h_2f2509d8-512e-4191-a295-3e79802650ac/dns/0.log" Apr 20 22:23:37.104902 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:37.104872 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vb97h_2f2509d8-512e-4191-a295-3e79802650ac/kube-rbac-proxy/0.log" Apr 20 22:23:37.124482 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:37.124453 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6drvt_3a1935ff-0056-494d-bd40-1316c97c620f/dns-node-resolver/0.log" Apr 20 22:23:37.669525 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:37.669490 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j5w8c_25ce781b-7c4c-499a-bc4a-2efb25261488/node-ca/0.log" Apr 20 22:23:38.418542 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:38.418513 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfxw55h_716e293f-d7b5-4987-87c9-9f07afbd37d3/istio-proxy/0.log" Apr 20 22:23:38.640838 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:38.640812 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-8rs64_2b3c675b-1eb0-4857-8212-055e3a3de56b/istio-proxy/0.log" Apr 20 22:23:39.168189 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:39.168157 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gm5l7_bf0d97d7-a0a1-4f99-802a-39ac411ff714/serve-healthcheck-canary/0.log" Apr 20 22:23:39.734589 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:39.734556 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hmbdw_fedb37b6-dbb2-4b57-ba53-813cae30c648/kube-rbac-proxy/0.log" Apr 20 22:23:39.752056 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:39.752027 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hmbdw_fedb37b6-dbb2-4b57-ba53-813cae30c648/exporter/0.log" Apr 20 22:23:39.771262 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:39.771233 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hmbdw_fedb37b6-dbb2-4b57-ba53-813cae30c648/extractor/0.log" Apr 20 22:23:41.719150 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:41.719117 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-86f4dd7d58-5gqpr_3092ae0d-af00-4349-86b0-fe8c233ebbb4/maas-api/0.log" Apr 20 22:23:41.793133 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:41.793102 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-57d97678df-h4wrf_6ca35513-25fa-495b-b50d-87971a4984e4/manager/0.log" Apr 20 22:23:41.921177 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:41.921136 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-f5f47469b-pqzwt_e80eab89-a43e-4d96-b8be-1bc48afb35f6/manager/0.log" Apr 20 22:23:42.365840 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:42.365809 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5l2ks/perf-node-gather-daemonset-bdjxp" Apr 20 22:23:43.051976 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:43.051942 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-796667c6c8-g6npc_96507bea-c4f3-42ae-8e23-77ed0ddd303b/manager/0.log" Apr 20 22:23:49.153858 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.153824 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fxd5b_f2958395-eab5-4338-b6d6-170a01a66c73/kube-multus-additional-cni-plugins/0.log" Apr 20 22:23:49.172466 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.172443 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fxd5b_f2958395-eab5-4338-b6d6-170a01a66c73/egress-router-binary-copy/0.log" Apr 20 22:23:49.190312 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.190291 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fxd5b_f2958395-eab5-4338-b6d6-170a01a66c73/cni-plugins/0.log" Apr 20 22:23:49.208066 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.208045 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fxd5b_f2958395-eab5-4338-b6d6-170a01a66c73/bond-cni-plugin/0.log" Apr 20 22:23:49.228026 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.228005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fxd5b_f2958395-eab5-4338-b6d6-170a01a66c73/routeoverride-cni/0.log" Apr 20 22:23:49.247737 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.247716 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fxd5b_f2958395-eab5-4338-b6d6-170a01a66c73/whereabouts-cni-bincopy/0.log" Apr 20 22:23:49.267349 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.267329 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fxd5b_f2958395-eab5-4338-b6d6-170a01a66c73/whereabouts-cni/0.log" Apr 20 22:23:49.301079 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.301053 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kpm8f_1cf66444-7265-4d80-80d8-107f0de4d0db/kube-multus/0.log" Apr 20 22:23:49.451829 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.451755 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xmrt9_a08eea80-f553-4499-a8dc-94c9591d8221/network-metrics-daemon/0.log" Apr 20 22:23:49.468243 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:49.468212 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xmrt9_a08eea80-f553-4499-a8dc-94c9591d8221/kube-rbac-proxy/0.log" Apr 20 22:23:50.481961 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:50.481926 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-controller/0.log" Apr 20 22:23:50.500198 ip-10-0-140-110 
kubenswrapper[2574]: I0420 22:23:50.500159 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/0.log" Apr 20 22:23:50.519809 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:50.519767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovn-acl-logging/1.log" Apr 20 22:23:50.541089 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:50.541045 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/kube-rbac-proxy-node/0.log" Apr 20 22:23:50.560651 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:50.560623 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 22:23:50.575567 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:50.575545 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/northd/0.log" Apr 20 22:23:50.593072 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:50.593042 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/nbdb/0.log" Apr 20 22:23:50.611173 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:50.611153 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/sbdb/0.log" Apr 20 22:23:50.797163 ip-10-0-140-110 kubenswrapper[2574]: I0420 22:23:50.797135 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mlfps_a3605f9a-a9e1-40d9-ab62-917e4aca6f0c/ovnkube-controller/0.log"